WO2024042668A1 - Élément photodétecteur - Google Patents

Élément photodétecteur

Info

Publication number
WO2024042668A1
Authority
WO
WIPO (PCT)
Prior art keywords
circuit
light receiving
element according
interval
allocation
Prior art date
Application number
PCT/JP2022/031992
Other languages
English (en)
Japanese (ja)
Inventor
健一郎 安城
Original Assignee
ソニーセミコンダクタソリューションズ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーセミコンダクタソリューションズ株式会社 filed Critical ソニーセミコンダクタソリューションズ株式会社
Priority to PCT/JP2022/031992 priority Critical patent/WO2024042668A1/fr
Publication of WO2024042668A1 publication Critical patent/WO2024042668A1/fr

Classifications

    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L 27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L 27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L 27/144 Devices controlled by radiation
    • H01L 27/146 Imager structures
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L 31/00 Semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation; Processes or apparatus specially adapted for the manufacture or treatment thereof or of parts thereof; Details thereof
    • H01L 31/08 Semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation; Processes or apparatus specially adapted for the manufacture or treatment thereof or of parts thereof; Details thereof in which radiation controls flow of current through the device, e.g. photoresistors
    • H01L 31/10 Semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation; Processes or apparatus specially adapted for the manufacture or treatment thereof or of parts thereof; Details thereof in which radiation controls flow of current through the device, e.g. photoresistors characterised by potential barriers, e.g. phototransistors
    • H01L 31/101 Devices sensitive to infrared, visible or ultraviolet radiation
    • H01L 31/102 Devices sensitive to infrared, visible or ultraviolet radiation characterised by only one potential barrier
    • H01L 31/107 Devices sensitive to infrared, visible or ultraviolet radiation characterised by only one potential barrier the potential barrier working in avalanche mode, e.g. avalanche photodiodes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/70 SSIS architectures; Circuits associated therewith
    • H04N 25/76 Addressed sensors, e.g. MOS or CMOS sensors
    • H04N 25/78 Readout circuits for addressed sensors, e.g. output amplifiers or A/D converters

Definitions

  • the present disclosure relates to a photodetecting element.
  • As an imaging device, an integration-type image sensor that counts the number of photons incident on a light receiving element during each exposure time is known.
  • In recent years, the use of a SPAD (Single-Photon Avalanche Diode), which is one type of avalanche photodiode, as such a light receiving element has been considered.
  • A response of the SPAD corresponds to a 1-bit AD conversion, and recording the number of responses corresponds to recording the number of photons.
  • However, because of the particle nature of light, the number of photons varies from one exposure time to the next, a phenomenon known as optical shot noise (illustrated by the sketch below).
  • When optical shot noise mixes into image data, there is a concern that image quality will deteriorate.
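As an illustration only (not part of the disclosure), the following Python sketch models photon arrivals as a Poisson process with an assumed mean of 20 photons per exposure: the per-exposure count fluctuates even though the illuminance is constant, which is the optical shot noise described above. All numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

mean_photons_per_exposure = 20.0  # hypothetical average: photon rate x exposure time
num_exposures = 10

# At constant illuminance, photon arrivals form a Poisson process, so the count
# recorded in each exposure fluctuates around the mean (optical shot noise).
counts = rng.poisson(mean_photons_per_exposure, size=num_exposures)
print(counts)                        # varies from exposure to exposure
print(counts.std() / counts.mean())  # relative fluctuation, roughly 1/sqrt(mean)
```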
  • The present disclosure therefore provides a photodetection element that can suppress deterioration in image quality due to the particle nature of light.
  • A photodetection element includes: a pixel circuit that includes a light receiving element that reacts to photons contained in incident light, and that outputs a pixel signal indicating that the light receiving element has reacted to a photon; an allocation circuit that specifies the interval between the pixel signals for each exposure time and allocates the specified interval according to its size; and a latch circuit that stores data indicating the size of the interval in a distributed manner in a plurality of storage areas based on the allocation by the allocation circuit.
  • The photodetecting element may further include a data processing circuit that creates a distribution of the intervals using the data and calculates an average value of the number of photons incident on the light receiving element per exposure time based on the created distribution.
  • the data processing circuit may calculate the average value by calculating an exponential distribution coefficient.
  • A plurality of the light receiving elements may be arranged in a matrix within the pixel circuit, and the allocation circuit and the latch circuit may be provided on a column-by-column, pixel-by-pixel, or row-by-row basis.
  • The allocation circuit may include: a plurality of flip-flops connected in cascade, to which a clock signal set to a predetermined frequency is input and which are reset by the pixel signal; and a plurality of logic circuits that perform logical operations on the clock signal input to each of the plurality of flip-flops and the pixel signal to allocate the interval to the plurality of storage areas.
  • the logic circuit may be a NAND circuit or a NOR circuit.
  • the latch circuit may include at least one storage element for each of the plurality of storage areas.
  • the memory element may be an SRAM (Static Random Access Memory).
  • the number of the storage elements may be different between the plurality of storage areas.
  • the number of storage elements that store data with the smallest interval may be the largest.
  • the number of storage elements that store data with the largest interval may be the smallest.
  • the data processing circuit may operate in a first mode in which the average value is calculated or in a second mode in which all the data stored in the latch circuit are added up, depending on the illuminance of the incident light.
  • The data processing circuit may operate in the first mode when the illuminance is greater than a preset threshold, and may operate in the second mode when the illuminance is less than or equal to the threshold.
  • the photodetecting element may further include a light receiving chip on which the light receiving element is arranged, and a logic chip stacked on the light receiving chip and on which the allocation circuit and the latch circuit are arranged.
  • the light receiving element may be a SPAD.
  • FIG. 1 is a block diagram showing a schematic configuration of an imaging device according to a first embodiment.
  • FIG. 2 is a diagram showing an example of a stacked structure of a photodetecting element according to the first embodiment.
  • FIG. 3 is a plan view showing an example of the configuration of a light receiving chip according to the first embodiment.
  • FIG. 4 is a block diagram showing an example of the configuration of a logic chip according to the first embodiment.
  • FIG. 5 is a circuit diagram showing an example of the configuration of a pixel circuit in the first embodiment.
  • FIG. 6 is a circuit diagram showing an example of the configuration of an allocation circuit and a latch circuit.
  • FIG. 7 is a schematic diagram showing an example of how photons enter a light receiving element.
  • FIG. 8 is a diagram showing an example of the distribution of the number of photons incident on a light receiving element per exposure time.
  • FIG. 9 is a diagram illustrating an example of the distribution of intervals between pixel signals.
  • FIG. 10 is a block diagram showing an example of a schematic configuration of a vehicle control system.
  • FIG. 11 is an explanatory diagram showing an example of installation positions of an outside-vehicle information detection section and an imaging section.
  • FIG. 1 is a block diagram showing a schematic configuration of an imaging device according to a first embodiment.
  • the imaging device 100 includes an imaging lens 110, a photodetection element 200, a recording section 120, and an imaging control section 130.
  • the imaging device 100 can be applied to, for example, a smartphone, a digital camera, a personal computer, a vehicle-mounted camera, or an IoT (Internet of Things) camera.
  • the imaging lens 110 collects incident light and guides it to the photodetector element 200.
  • the photodetecting element 200 captures image data under the control of the imaging control unit 130. This photodetecting element 200 supplies captured image data to the recording unit 120 via a signal line 209.
  • the recording unit 120 records image data.
  • the imaging control unit 130 controls the photodetection element 200 to capture image data.
  • the imaging control unit 130 supplies, for example, a clock signal such as a vertical synchronization signal, an exposure control signal for controlling an exposure period, and an exposure time to the photodetection element 200 via a signal line 139.
  • the imaging device 100 may further include an interface, and may transmit image data to the outside through the interface, or may further include a display section, and may display image data on the display section.
  • FIG. 2 is a diagram showing an example of a laminated structure of the photodetecting element 200 according to the first embodiment.
  • the photodetecting element 200 according to this embodiment includes a light receiving chip 201 and a logic chip 202 stacked on the light receiving chip 201. A signal line for transmitting signals is provided between these chips.
  • FIG. 3 is a plan view showing an example of the configuration of the light receiving chip 201 according to the first embodiment.
  • the light receiving chip 201 shown in FIG. 3 is provided with a light receiving section 210.
  • a plurality of light receiving elements 220 are arranged in a matrix.
  • the light receiving element 220 photoelectrically converts incident light and outputs a photocurrent.
  • the light receiving element 220 is, for example, an avalanche photodiode that can detect whether or not one photon is incident by amplifying photocurrent. In this embodiment, it is desirable to use SPAD.
  • FIG. 4 is a block diagram showing an example of the configuration of the logic chip 202 according to the first embodiment.
  • In the logic chip 202 shown in FIG. 4, a vertical control section 230, a logic array section 240, an allocation circuit 260, a latch circuit 270, and a data processing circuit 280 are arranged.
  • a clock signal is input to the vertical control section 230.
  • the frequency of the clock signal is set to, for example, 100 MHz.
  • the vertical control unit 230 controls the logic array unit 240 based on the clock signal.
  • An exposure control signal from the imaging control section 130 is input to the logic array section 240. Further, in the logic array section 240, a plurality of readout circuits 250 are arranged in a matrix. Each readout circuit 250 is provided for each light receiving element 220. Each readout circuit 250 is connected to a corresponding light receiving element 220 via a signal line. The light receiving element 220 and the readout circuit 250 function as a pixel circuit that generates a pixel signal for one pixel in image data. This pixel signal corresponds to a photon arrival signal indicating that the light receiving element 220 has responded to a photon contained in the incident light.
  • the allocation circuit 260 specifies the interval of pixel signals (photon arrival signals). Further, the allocation circuit 260 allocates data indicating the specified interval to a storage area previously allocated to the latch circuit 270 according to the size of the interval. The configuration of the allocation circuit 260 will be described later.
  • the latch circuit 270 stores data indicating the interval between pixel signals. At this time, the latch circuit 270 stores the data in the storage area allocated by the allocation circuit 260. The configuration of the latch circuit 270 will also be described later.
  • the data processing circuit 280 estimates the average number of photons that entered the light receiving element 220 within the exposure period using the data stored in the latch circuit 270. A method for estimating the average value of the number of photons will be described later.
  • FIG. 5 is a circuit diagram showing an example of the configuration of a pixel circuit in the first embodiment.
  • the pixel circuit 300 shown in FIG. 5 includes a light receiving element 220 and a readout circuit 250.
  • This readout circuit 250 includes a resistor 251, an inverter 252, and a switch 253.
  • In the light receiving element 220, the cathode is connected to the power supply terminal via the resistor 251, and the anode is connected to the ground terminal.
  • photocurrent flows in the direction from the cathode to the anode.
  • One end of the resistor 251 is connected to the power supply terminal, and the other end is connected to the cathode of the light receiving element 220. Every time the light receiving element 220 detects the incidence of a photon, a photocurrent flows through the resistor 251, and the cathode potential of the light receiving element 220 drops from its initial value to a level lower than the power supply potential.
  • the inverter 252 inverts the cathode potential signal of the light receiving element 220 and outputs it as a pulse signal.
  • This pulse signal corresponds to a pixel signal.
  • This inverter 252 outputs a low level pulse signal when the cathode potential is higher than a predetermined value, and outputs a high level pulse signal when the cathode potential is below the predetermined value.
  • the signal of the anode potential of the light receiving element 220 may be used as the pixel signal. In this case, the anode of the light receiving element 220 is connected to the input terminal of the inverter 252.
  • the switch 253 is turned on and off based on the control of the vertical control section 230.
  • When the switch 253 is turned on, the pixel signal is output to the allocation circuit 260 via the vertical signal line 290.
  • the vertical signal line 290 is commonly connected to a plurality of pixel circuits 300 arranged in the column direction (vertical direction). Therefore, in this embodiment, each allocation circuit 260 is arranged in units of pixel columns. Note that each allocation circuit 260 may be arranged in units of pixel rows.
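For intuition, here is a behavioral sketch (with assumed rates and times, not values from the disclosure) of the quantity the allocation circuit 260 works with: the intervals between successive pixel signals produced by one pixel during an exposure, modeled as a Poisson process.

```python
import numpy as np

def simulate_pixel_signal_intervals(rate_hz: float, exposure_s: float, seed: int = 0):
    """Hypothetical behavioral model of one pixel circuit 300: photon detections
    within an exposure are drawn as a Poisson process, and the intervals between
    successive pixel signals (what the allocation circuit 260 would measure)
    are returned."""
    rng = np.random.default_rng(seed)
    t, pulse_times = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_hz)  # exponential inter-arrival time
        if t >= exposure_s:
            break
        pulse_times.append(t)
    return np.diff(pulse_times)              # intervals between pixel signals

intervals = simulate_pixel_signal_intervals(rate_hz=50e6, exposure_s=10e-6)
print(len(intervals) + 1, "pulses; mean interval =", intervals.mean(), "s")
```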
  • FIG. 6 is a circuit diagram showing an example of the configuration of the allocation circuit 260 and the latch circuit 270.
  • The allocation circuit 260 is a ripple-counter type circuit, and includes a first flip-flop 261a to a fifth flip-flop 261e and a first NAND circuit 262a to a sixth NAND circuit 262f. Note that although this allocation circuit 260 has five flip-flops, the number of flip-flops is not limited to this and may be any number of two or more. The number of NAND circuits may be one more than the number of flip-flops. Further, the allocation circuit 260 may include NOR circuits instead of NAND circuits.
  • The first flip-flop 261a to the fifth flip-flop 261e are connected in cascade, with the first flip-flop 261a as the first stage and the fifth flip-flop 261e as the last stage.
  • a clock signal is input to the first flip-flop 261a.
  • a clock signal is input to the second flip-flop 261b to the fifth flip-flop 261e via the flip-flop in the previous stage.
  • each flip-flop is reset by a high-level pixel signal indicating that a photon has reached (detected) the light receiving element 220.
  • the first NAND circuit 262a performs a NAND logical operation on the clock signal input to the first flip-flop 261a and the pixel signal.
  • the second NAND circuit 262b performs a NAND logical operation on the clock signal input to the second flip-flop 261b via the first flip-flop 261a and the pixel signal.
  • the third NAND circuit 262c performs a NAND logical operation on the clock signal input to the third flip-flop 261c via the first flip-flop 261a and the second flip-flop 261b and the pixel signal.
  • the fourth NAND circuit 262d performs a NAND logical operation on the clock signal input to the fourth flip-flop 261d via the first flip-flop 261a to the third flip-flop 261c and the pixel signal.
  • the fifth NAND circuit 262e performs a NAND logical operation on the clock signal input to the fifth flip-flop 261e via the first flip-flop 261a to the fourth flip-flop 261d and the pixel signal.
  • the sixth NAND circuit 262f performs a NAND logical operation on the clock signal output from the fifth flip-flop 261e and the pixel signal.
  • the latch circuit 270 has a first memory element 271a to a sixth memory element 271f.
  • Each storage element is composed of, for example, an SRAM (Static Random Access Memory).
  • the first storage element 271a is connected to the output terminal of the first NAND circuit 262a.
  • The second memory element 271b to the sixth memory element 271f are connected to the output terminals of the second NAND circuit 262b to the sixth NAND circuit 262f, respectively.
  • the number of elements decreases stepwise in the order of the first memory element 271a to the sixth memory element 271f.
  • five bits of data can be stored in the five first storage elements 271a.
  • Three bits of data can be stored in each of the three second storage elements 271b and the three third storage elements 271c.
  • two bits of data can be stored in the two fourth storage elements 271d.
  • one fifth storage element 271e and one sixth storage element 271f can each store one bit of data.
  • The number of each storage element, in other words, the number of bits that can be stored in the latch circuit 270, is not limited to this.
  • The number of each storage element, that is, the number of bits of each storage area, can be determined based on the exponential cumulative distribution at the maximum amount of light that the light receiving element 220 can receive (see the sketch below).
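A minimal sketch of that sizing idea, under assumed operating conditions (a hypothetical maximum photon rate and exposure time). It illustrates why storage depth can decrease toward longer intervals; the exact 5/3/3/2/1/1 element counts of the embodiment depend on its own conditions and are not reproduced here.

```python
import numpy as np

# Illustrative numbers only (not from the patent beyond the 10/20/40/... ns edges).
max_rate_hz = 100e6                    # assumed maximum photon rate
exposure_s = 1e-6                      # assumed exposure time
edges_s = np.array([0, 10e-9, 20e-9, 40e-9, 80e-9, 160e-9, 320e-9])

# Exponential cumulative distribution of the interval at the maximum rate gives
# the probability that an interval lands in each storage area.
cdf = 1.0 - np.exp(-max_rate_hz * edges_s)
bin_probability = np.diff(cdf)
expected_per_area = max_rate_hz * exposure_s * bin_probability

# Size each storage area so it can count up to its expected number of events.
bits_per_area = np.ceil(np.log2(expected_per_area + 1)).astype(int)
print(expected_per_area.round(2))
print(bits_per_area)                   # depth decreases toward longer intervals
```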
  • The latch circuit 270 of this embodiment is divided in advance into six storage areas, the first storage element 271a to the sixth storage element 271f, according to the interval Δt between pixel signals. The allocation circuit 260 detects the interval Δt between the pixel signals and selects the storage element in which data is to be stored according to the detected interval Δt. In the allocation circuit 260, for example, the NAND circuit whose output signal is at a high level shifts toward the most significant bit (MSB) side each time 10 ns, 20 ns, 40 ns, and so on elapse before the flip-flops are next reset.
  • The first data indicating that the interval Δt is 10 ns or less is stored in the first storage element 271a connected to the output terminal of the first NAND circuit 262a.
  • The second data indicating that the interval Δt is greater than 10 ns and less than 20 ns is stored in the second storage element 271b connected to the output terminal of the second NAND circuit 262b.
  • The third data indicating that the interval Δt is greater than 20 ns and less than 40 ns is stored in the third storage element 271c connected to the output terminal of the third NAND circuit 262c.
  • The fourth data indicating that the interval Δt is greater than 40 ns and less than 80 ns is stored in the fourth storage element 271d connected to the output terminal of the fourth NAND circuit 262d.
  • The fifth data indicating that the interval Δt is greater than 80 ns and less than 160 ns is stored in the fifth storage element 271e connected to the output terminal of the fifth NAND circuit 262e.
  • The sixth data indicating that the interval Δt is greater than 160 ns and less than 320 ns is stored in the sixth storage element 271f connected to the output terminal of the sixth NAND circuit 262f.
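A behavioral sketch of the interval-to-storage-area mapping just listed, assuming a 100 MHz clock (10 ns period). The handling of intervals longer than 320 ns is an assumption, since the text does not state it.

```python
# Behavioral model of the allocation: the ripple counter effectively counts how
# many clock-period doublings (10 ns, 20 ns, 40 ns, ...) elapse before the next
# pixel signal resets it, and the corresponding NAND output selects the area.
BIN_EDGES_NS = [10, 20, 40, 80, 160, 320]   # upper edges, from the embodiment

def storage_area_for_interval(dt_ns: float) -> int:
    """Return the storage-area index (0 = first storage element 271a)."""
    for index, edge in enumerate(BIN_EDGES_NS):
        if dt_ns <= edge:
            return index
    return len(BIN_EDGES_NS) - 1   # behavior above 320 ns is assumed, not stated

for dt in (7, 15, 35, 75, 150, 300):
    area = storage_area_for_interval(dt)
    print(f"interval {dt:>3} ns -> storage element 271{'abcdef'[area]}")
```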
  • the data processing circuit 280 estimates the average value of the number of photons that entered the light receiving element 220 within the exposure time.
  • a method for estimating the average value of the number of photons will be explained.
  • FIG. 7 is a schematic diagram showing an example of how photons enter a light receiving element.
  • photons 400 are incident on the light receiving element 220 at every exposure time T.
  • a phenomenon in which the number of photons 400 incident on the light receiving element 220 per exposure time T varies, that is, so-called optical shot noise may occur.
  • FIG. 8 is a diagram showing an example of the distribution of the number of photons 400 incident on the light receiving element 220 per exposure time T.
  • The data processing circuit 280 uses the data stored in the latch circuit 270 to create a histogram of the intervals Δt between pixel signals.
  • FIG. 9 is a diagram showing an example of the distribution of the interval Δt between pixel signals.
  • The interval Δt between the pixel signals follows an exponential distribution that can be expressed by a function of the form λ·exp(−λΔt).
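A short derivation, under the standard assumption (not spelled out in the text) that photon arrivals form a homogeneous Poisson process, of why fitting the exponential coefficient yields the average photon count per exposure:

```latex
% Assumption: photon arrivals form a homogeneous Poisson process with rate \lambda.
% Then the interval between pixel signals is exponentially distributed and the
% expected count per exposure time T is \lambda T:
p(\Delta t) = \lambda \, e^{-\lambda \Delta t},
\qquad
\mathbb{E}[N \mid T] = \lambda T .
% Fitting \lambda to the interval histogram therefore recovers the average photon
% count per exposure without being affected by the count fluctuation of any
% single exposure (optical shot noise).
```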
  • The data processing circuit 280 calculates the exponential distribution coefficient λ by performing numerical calculation processing, such as the method of least squares, on the histogram. This coefficient λ corresponds to the average value λ mentioned above. Thereby, the average value λ of the number of photons 400 incident on the light receiving element 220 per exposure time T can be estimated (a numerical sketch follows below).
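A numerical sketch of one possible fitting procedure. The text only names the method of least squares; the log-linear fit and all counts below are assumptions for illustration.

```python
import numpy as np

# Hypothetical counts read out from the six storage areas of the latch circuit
# 270, with the bin edges (in ns) used by the allocation circuit 260.
counts = np.array([41.0, 18.0, 9.0, 3.0, 1.0, 0.0])
edges_ns = np.array([0.0, 10.0, 20.0, 40.0, 80.0, 160.0, 320.0])

centers = 0.5 * (edges_ns[:-1] + edges_ns[1:])
widths = np.diff(edges_ns)

# Normalize to a density per ns and keep non-empty bins for the log-linear fit.
density = counts / (widths * counts.sum())
mask = density > 0

# Least-squares fit of log(density) = log(lambda) - lambda * t.
slope, intercept = np.polyfit(centers[mask], np.log(density[mask]), 1)
lambda_per_ns = -slope

exposure_ns = 1000.0  # assumed exposure time T
print("estimated average photons per exposure:", lambda_per_ns * exposure_ns)
```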
  • As described above, the allocation circuit 260 can distribute data indicating the intervals Δt of the photons 400 that reached the light receiving element 220 within the exposure time over the storage areas of the latch circuit 270 according to their size. The data processing circuit 280 can therefore calculate the average value λ of the number of photons reaching the light receiving element 220 within the exposure time using the data read out for each storage area of the latch circuit 270. By directly determining the average value λ of the number of photons in this way, the influence of optical shot noise is theoretically eliminated.
  • the allocation circuit 260 and the latch circuit 270 are provided for each pixel column. However, if the circuit can be miniaturized, the allocation circuit 260 and the latch circuit 270 may be placed within the pixel circuit 300. In this case, allocation circuit 260 and latch circuit 270 are provided for each pixel. Therefore, data allocation processing by the allocation circuit 260 and storage processing by the latch circuit 270 can be performed independently for each pixel, making it possible to speed up the processing.
  • The data processing circuit 280 calculates the average value λ of the number of photons incident on the light receiving element 220, regardless of the illuminance of the incident light.
  • The average value λ approaches zero as the illuminance of the incident light decreases. Therefore, in a low-illuminance environment, the influence of the optical shot noise described in the first embodiment is also reduced.
  • the sum of all the data stored in the first to sixth storage elements 271a to 271f of the latch circuit 270 is equivalent to the number of photons within the exposure time T.
  • The operation of the data processing circuit 280 is switched between a first mode in which the average value λ is calculated and a second mode in which all the data from the latch circuit 270 are added together, depending on the illuminance of the incident light. For example, in a high-illuminance environment where the illuminance of the incident light is greater than a preset threshold, the data processing circuit 280 operates in the first mode. Conversely, in a low-illuminance environment where the illuminance of the incident light is less than or equal to the threshold, the data processing circuit 280 operates in the second mode. If the imaging device 100 recognizes in advance whether the imaging condition is a high-illuminance environment or a low-illuminance environment, the data processing circuit 280 may be controlled to operate in the first mode or the second mode accordingly.
  • As described above, the operation of the data processing circuit 280 is switched between the first mode and the second mode depending on the illuminance of the incident light. Since the second mode simply adds all the data in the latch circuit 270, the process of creating a histogram becomes unnecessary, and the processing time is reduced compared to the first mode. Furthermore, since the second mode is used in a low-illuminance environment, it is less susceptible to optical shot noise. In a high-illuminance environment, which is more susceptible to optical shot noise, the data processing circuit 280 operates in the first mode, in which the average value λ of the number of photons is calculated (see the sketch below).
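A minimal sketch of that mode selection, with a hypothetical threshold and function names; the first-mode fit is passed in as a callable such as the one sketched earlier.

```python
def process_pixel(latch_counts, illuminance_lux, estimate_average,
                  threshold_lux=100.0):
    """Hypothetical mode selection for the data processing circuit 280.

    latch_counts     -- counts read from the storage areas of the latch circuit 270
    illuminance_lux  -- scene illuminance (assumed to be known to the device)
    estimate_average -- callable implementing the first-mode exponential fit
    threshold_lux    -- assumed switching threshold
    """
    if illuminance_lux > threshold_lux:
        # First mode: build the interval histogram and fit the exponential
        # coefficient to obtain the average photon count per exposure.
        return estimate_average(latch_counts)
    # Second mode: at low illuminance shot noise is small, so the plain sum of
    # all stored data (the number of detected photons) is used directly.
    return float(sum(latch_counts))

# Example: in the low-illuminance branch the result is just the total count.
print(process_pixel([3, 1, 0, 0, 0, 0], illuminance_lux=5.0,
                    estimate_average=lambda c: None))
```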
  • the technology according to the present disclosure (this technology) can be applied to various products.
  • The technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 10 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile object control system to which the technology according to the present disclosure can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside vehicle information detection unit 12030, an inside vehicle information detection unit 12040, and an integrated control unit 12050.
  • a microcomputer 12051, an audio/image output section 12052, and an in-vehicle network I/F (Interface) 12053 are illustrated as the functional configuration of the integrated control unit 12050.
  • the drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • The drive system control unit 12010 functions as a control device for a driving force generation device that generates a driving force for the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism that transmits the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, a braking device that generates a braking force for the vehicle, and the like.
  • the body system control unit 12020 controls the operations of various devices installed in the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a turn signal, or a fog lamp.
  • Radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
  • the body system control unit 12020 receives input of these radio waves or signals, and controls the door lock device, power window device, lamp, etc. of the vehicle.
  • the external information detection unit 12030 detects information external to the vehicle in which the vehicle control system 12000 is mounted.
  • an imaging section 12031 is connected to the outside-vehicle information detection unit 12030.
  • the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the exterior of the vehicle, and receives the captured image.
  • Based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing for detecting a person, a car, an obstacle, a sign, characters on a road surface, or the like, or may perform distance detection processing.
  • the imaging unit 12031 is a sensor that receives light and outputs an electrical signal according to the amount of received light.
  • the imaging unit 12031 can output the electrical signal as an image or as distance measurement information.
  • the light received by the imaging unit 12031 may be visible light or non-visible light such as infrared rays.
  • the in-vehicle information detection unit 12040 detects in-vehicle information.
  • a driver condition detection section 12041 that detects the condition of the driver is connected to the in-vehicle information detection unit 12040.
  • The driver condition detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver condition detection unit 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
  • The microcomputer 12051 can calculate control target values for the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output control commands to the drive system control unit 12010.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including vehicle collision avoidance or impact mitigation, follow-up driving based on the inter-vehicle distance, driving while maintaining vehicle speed, vehicle collision warning, vehicle lane departure warning, and the like.
  • The microcomputer 12051 can also perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without relying on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on information about the surroundings of the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
  • The microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of preventing glare, such as switching from high beams to low beams, by controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030.
  • the audio and image output unit 12052 transmits an output signal of at least one of audio and images to an output device that can visually or audibly notify information to the occupants of the vehicle or to the outside of the vehicle.
  • an audio speaker 12061, a display section 12062, and an instrument panel 12063 are illustrated as output devices.
  • Display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
  • FIG. 11 is a diagram showing an example of the installation position of the imaging section 12031.
  • the imaging unit 12031 includes imaging units 12101, 12102, 12103, 12104, and 12105.
  • the imaging units 12101, 12102, 12103, 12104, and 12105 are provided at, for example, the front nose of the vehicle 12100, the side mirrors, the rear bumper, the back door, and the upper part of the windshield inside the vehicle.
  • An imaging unit 12101 provided in the front nose and an imaging unit 12105 provided above the windshield inside the vehicle mainly acquire images in front of the vehicle 12100.
  • Imaging units 12102 and 12103 provided in the side mirrors mainly capture images of the sides of the vehicle 12100.
  • An imaging unit 12104 provided in the rear bumper or back door mainly captures images of the rear of the vehicle 12100.
  • the imaging unit 12105 provided above the windshield inside the vehicle is mainly used to detect preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
  • FIG. 11 shows an example of the imaging range of the imaging units 12101 to 12104.
  • An imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose.
  • Imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively.
  • An imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or back door.
  • For example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 viewed from above can be obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of image sensors, or may be an image sensor having pixels for phase difference detection.
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 determines the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change in this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the closest three-dimensional object on the path of the vehicle 12100 that is traveling at a predetermined speed (for example, 0 km/h or more) in approximately the same direction as the vehicle 12100. Furthermore, the microcomputer 12051 can set in advance an inter-vehicle distance to be secured from the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 between obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see. The microcomputer 12051 then determines a collision risk indicating the degree of risk of collision with each obstacle, and in a situation where the collision risk is equal to or higher than a set value and there is a possibility of a collision, it can provide driving assistance for collision avoidance by outputting a warning to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • For example, the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the images captured by the imaging units 12101 to 12104.
  • Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not the object is a pedestrian.
  • When a pedestrian is recognized, the audio image output unit 12052 controls the display unit 12062 to superimpose and display a rectangular contour line for emphasis on the recognized pedestrian.
  • the audio image output unit 12052 may control the display unit 12062 to display an icon or the like indicating a pedestrian at a desired position.
  • the technology according to the present disclosure can be applied to, for example, the imaging unit 12031 among the configurations described above.
  • the imaging device 100 can be applied to the imaging unit 12031.
  • (1) A photodetection element comprising: a pixel circuit that includes a light receiving element that reacts to photons contained in incident light and outputs a pixel signal indicating that the light receiving element has reacted to a photon; an allocation circuit that specifies the interval between the pixel signals for each exposure time and allocates the specified interval according to its size; and a latch circuit that stores data indicating the size of the interval in a distributed manner in a plurality of storage areas based on the allocation by the allocation circuit.
  • (2) The photodetection element according to (1), further comprising a data processing circuit that creates a distribution of the intervals using the data and calculates an average value of the number of photons incident on the light receiving element per exposure time based on the created distribution.
  • the distribution of the intervals is an exponential distribution
  • the plurality of light receiving elements are arranged in a matrix within the pixel circuit;
  • the allocation circuit includes: a plurality of flip-flops connected in cascade, to which a clock signal set to a predetermined frequency is input and which are reset by the pixel signal; and a plurality of logic circuits that perform logical operations on the clock signal input to each of the plurality of flip-flops and the pixel signal to allocate the interval to the plurality of storage areas;
  • the data processing circuit operates in a first mode in which the average value is calculated or in a second mode in which all the data stored in the latch circuit are added up, depending on the illuminance of the incident light.
  • the data processing circuit operates in the first mode when the illuminance is greater than a preset threshold, and operates in the second mode when the illuminance is less than or equal to the threshold.
  • 200: Photodetection element, 201: Light receiving chip, 202: Logic chip, 220: Light receiving element, 260: Allocation circuit, 261a to 261e: First flip-flop to fifth flip-flop, 262a to 262f: First NAND circuit to sixth NAND circuit, 270: Latch circuit, 271a to 271f: First memory element to sixth memory element, 280: Data processing circuit, 300: Pixel circuit

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

A photodetection element according to one embodiment of the present disclosure comprises: a pixel circuit that includes a light receiving element responsive to photons contained in incident light and that outputs pixel signals indicating that the light receiving element has reacted to photons; allocation circuits that identify intervals between the pixel signals for each unit of exposure time and allocate the identified intervals according to their size; and latch circuits that store data indicating the size of the intervals by distributing the data over a plurality of storage areas based on the allocation by the allocation circuits.
PCT/JP2022/031992 2022-08-25 2022-08-25 Élément photodétecteur WO2024042668A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/031992 WO2024042668A1 (fr) 2022-08-25 2022-08-25 Élément photodétecteur

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/031992 WO2024042668A1 (fr) 2022-08-25 2022-08-25 Élément photodétecteur

Publications (1)

Publication Number Publication Date
WO2024042668A1 true WO2024042668A1 (fr) 2024-02-29

Family

ID=90012775

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/031992 WO2024042668A1 (fr) 2022-08-25 2022-08-25 Élément photodétecteur

Country Status (1)

Country Link
WO (1) WO2024042668A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020088520A (ja) * 2018-11-21 2020-06-04 キヤノン株式会社 光電変換装置及び撮像システム
JP2022073105A (ja) * 2020-10-30 2022-05-17 ソニーセミコンダクタソリューションズ株式会社 受光装置、受光装置の制御方法、および、測距システム
WO2022097522A1 (fr) * 2020-11-05 2022-05-12 ソニーセミコンダクタソリューションズ株式会社 Capteur de télémétrie et système de télémétrie

Similar Documents

Publication Publication Date Title
US11659304B2 (en) Solid-state imaging element, imaging device, and control method of solid-state imaging element
US11950009B2 (en) Solid-state image sensor
US11330202B2 (en) Solid-state image sensor, imaging device, and method of controlling solid-state image sensor
US20230179880A1 (en) Image sensor and electronic device
US11823466B2 (en) Object detection device, object detection system, and object detection method
WO2021117350A1 (fr) Élément d'imagerie à semi-conducteurs et dispositif d'imagerie
US20200057149A1 (en) Optical sensor and electronic device
JP7414440B2 (ja) 測距センサ
US20220006965A1 (en) Solid-state imaging device, imaging apparatus, and method for controlling solid-state imaging device
US11076148B2 (en) Solid-state image sensor, imaging apparatus, and method for controlling solid-state image sensor
US20240163588A1 (en) Solid-state imaging element and imaging device
WO2024042668A1 (fr) Élément photodétecteur
WO2022239418A1 (fr) Capteur de télémétrie et dispositif de télémétrie
WO2022270034A1 (fr) Dispositif d'imagerie, dispositif électronique et procédé de détection de lumière
WO2023195262A1 (fr) Dispositif de détection de lumière, dispositif de télémétrie et appareil électronique
WO2022254792A1 (fr) Élément de réception de lumière, procédé de commande associé et système de mesure de distance
WO2023286403A1 (fr) Dispositif de détection de lumière et système de mesure de distance
US20230228875A1 (en) Solid-state imaging element, sensing system, and control method of solid-state imaging element
US20230062562A1 (en) Sensing system and distance measuring system
US11483499B2 (en) Imaging apparatus for addition of pixels on the basis of a temperature of an imaging array
WO2021261079A1 (fr) Dispositif de détection de lumière et système de mesure de distance
WO2023171176A1 (fr) Élément récepteur de lumière et dispositif électronique
WO2022254832A1 (fr) Appareil de capture d'image, dispositif électronique et procédé de capture d'image
WO2023145344A1 (fr) Élément de réception de lumière et appareil électronique
JP2024067906A (ja) 光検出装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22956495

Country of ref document: EP

Kind code of ref document: A1