WO2022254792A1 - Light-receiving element, control method therefor, and distance measurement system - Google Patents


Info

Publication number
WO2022254792A1
Authority
WO
WIPO (PCT)
Prior art keywords
voltage
light
receiving element
control
pixel
Application number
PCT/JP2022/004585
Other languages
English (en)
Japanese (ja)
Inventor
Masataka Sato (正隆 佐藤)
Original Assignee
Sony Semiconductor Solutions Corporation
Application filed by Sony Semiconductor Solutions Corporation
Publication of WO2022254792A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70: SSIS architectures; Circuits associated therewith
    • H04N25/71: Charge-coupled device [CCD] sensors; Charge-transfer registers specially adapted for CCD sensors
    • H04N25/74: Circuitry for scanning or addressing the pixel array
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894: 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481: Constructional features, e.g. arrangements of optical elements
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50: Control of the SSIS exposure
    • H04N25/57: Control of the dynamic range

Definitions

  • The present disclosure relates to a light-receiving element, a driving method therefor, and a distance measurement system, and in particular to a light-receiving element, driving method, and ranging system that can improve distance measurement accuracy during low-speed operation while maintaining transfer efficiency during high-speed operation.
  • Conventionally, a light-receiving element that uses the indirect ToF (Time of Flight) method is known.
  • In an indirect ToF light-receiving element, signal charges generated by photoelectric conversion of reflected light are distributed to two charge storage units by, for example, two transfer gates, and the distance is calculated from the distribution ratio of the signal charges.
  • There has also been proposed a sensitivity modulation element that prevents electrons from flowing back to the photodiode by causing the charges transferred to the two charge storage portions to periodically overflow to a pixel drain electrode (see, for example, Patent Document 1).
  • An indirect ToF light-receiving element needs to be operated at a high modulation frequency, so the amount of charge that can be accumulated in the photodiode may be small compared with conventional imaging sensors. Surplus charge exceeding the saturation charge amount of the photodiode may therefore overflow and be unintentionally transferred to the charge accumulation section on the side opposite to the intended transfer direction. In that case, the ranging accuracy is lowered.
  • The present disclosure has been made in view of such circumstances, and aims to improve distance measurement accuracy during low-speed operation while maintaining transfer efficiency during high-speed operation.
  • The light-receiving element of the first aspect of the present disclosure includes: a pixel having a photoelectric conversion unit and a transfer transistor that transfers charges generated by the photoelectric conversion unit; and a voltage control unit that switches among a plurality of voltage values as the off-control voltage for turning off the transfer transistor.
  • In the driving method of the second aspect of the present disclosure, a light-receiving element comprising a pixel having a photoelectric conversion unit and a transfer transistor that transfers charges generated by the photoelectric conversion unit switches among a plurality of voltage values as the off-control voltage for turning off the transfer transistor.
  • The ranging system of the third aspect of the present disclosure comprises: a light-emitting unit that emits irradiation light; and a light-receiving element that receives the reflected light returned when the irradiation light is reflected by an object, wherein the light-receiving element includes a pixel having a photoelectric conversion unit and a transfer transistor that transfers charges generated by the photoelectric conversion unit, and a voltage control unit that switches among a plurality of voltage values as the off-control voltage for turning off the transfer transistor.
  • In the first to third aspects of the present disclosure, in a light-receiving element including a pixel having a photoelectric conversion unit and a transfer transistor that transfers charges generated by the photoelectric conversion unit, a plurality of voltage values are switched among as the off-control voltage for turning off the transfer transistor.
  • the light receiving element and the distance measuring system may be independent devices, or may be modules incorporated into other devices.
  • FIG. 1 is a block diagram showing a configuration example of a ranging system according to an embodiment of the present disclosure.
  • FIG. 2 is a block diagram showing a configuration example of the light receiving element in FIG. 1.
  • FIG. 3 is a diagram showing a circuit configuration example of a pixel.
  • FIG. 4 is a diagram explaining emission of irradiation light and driving of a pixel.
  • FIGS. 5 to 7 are diagrams explaining the method of calculating the depth value by the 2-Phase method and the 4-Phase method.
  • FIG. 8 is a diagram explaining driving of the transfer transistors during high-speed operation.
  • FIG. 9 is a diagram explaining driving of the transfer transistors during low-speed operation in comparison with high-speed operation.
  • FIG. 10 is a diagram summarizing applied-voltage control when a transfer transistor is turned off.
  • FIG. 11 is a block diagram showing a configuration example of an off-voltage control section.
  • FIG. 12 is a flowchart explaining off-voltage control processing.
  • FIG. 13 is a block diagram showing a configuration example of a smartphone as an electronic device equipped with the ranging system of the present disclosure.
  • FIG. 14 is a block diagram showing an example of a schematic configuration of a vehicle control system.
  • FIG. 15 is an explanatory diagram showing an example of installation positions of an outside-vehicle information detection unit and an imaging unit.
  • FIG. 1 is a block diagram showing a configuration example of a ranging system according to an embodiment of the present disclosure.
  • the ranging system 1 of FIG. 1 includes a control device 11, a light receiving device 12, and an illumination device 13.
  • the light receiving device 12 includes a light receiving element 31 and a lens 32.
  • the illumination device 13 includes an LD 21 and a light emitting section 22.
  • the control device 11 is a device that controls the operation of the distance measuring system 1 as a whole.
  • the control device 11 specifies an operation mode and supplies a measurement request to the light receiving device 12. Further, the control device 11 acquires from the light receiving device 12 the measurement data generated based on the result of the light received by the light receiving device 12 in response to the measurement request.
  • the control device 11 may be provided outside the ranging system 1 instead of being part of the configuration of the ranging system 1, for example, as a control unit of a host device in which the ranging system 1 is incorporated.
  • The operation modes that the control device 11 can specify to the light receiving device 12 include a ranging mode for measuring the distance to an object by the indirect ToF method and a 2D image shooting mode for generating a two-dimensional luminance image.
  • In the ranging mode, indirect ToF distance measurement calculates the distance to an object by detecting, as a phase difference, the flight time from the timing when the irradiation light is emitted to the timing when the reflected light is received.
  • the 2D image shooting mode is a mode for outputting a two-dimensional luminance image according to the amount of light received, like a normal image sensor.
  • the light receiving element 31 of the light receiving device 12 supplies a light emission pulse to the LD 21 and causes the light emitting section 22 to emit irradiation light.
  • the light receiving element 31 receives reflected light from subjects such as the objects 41 and 42 through the lens 32 .
  • the lens 32 forms an image on the light-receiving surface of the light-receiving element 31 using the reflected light reflected from the object as incident light. Note that the configuration of the lens 32 is arbitrary, and for example, the lens 32 can be configured with a plurality of lens groups.
  • The light emission pulse is a timing signal that serves as a reference for the light emitting operation of the light emitting unit 22 and the light receiving operation of the light receiving element 31, and is a pulse signal in which ON (High) and OFF (Low) are repeated at the modulation frequency Fmod.
  • When the operation mode is the distance measurement mode, the light receiving element 31 generates a depth image storing distance information to the subject based on the light reception result of the reflected light, and outputs it to the control device 11 as measurement data.
  • When the operation mode is the 2D image shooting mode, the light receiving device 12 generates a luminance image of the subject based on the result of receiving the reflected light, and outputs it to the control device 11 as measurement data.
  • A specific configuration of the light receiving element 31 will be described in detail with reference to FIG. 2 and subsequent figures.
  • the LD 21 of the illumination device 13 is, for example, a laser driver that drives the light emitting section 22, drives the light emitting section 22 based on the light emission pulse from the light receiving element 31, and causes the light emitting section 22 to output irradiation light.
  • The light emitting unit 22 is composed of, for example, a VCSEL (Vertical Cavity Surface Emitting Laser), and emits irradiation light when driven by the LD 21.
  • As the irradiation light, for example, infrared light (IR light) having a wavelength in the range of approximately 850 nm to 940 nm is used.
  • FIG. 2 is a block diagram showing a configuration example of the light receiving element 31 of FIG.
  • the light receiving element 31 includes a control section 51, a pixel array section 52, a pulse generation circuit 53, a tap driving section 54, a vertical driving section 55, a column processing section 56, a horizontal driving section 57, a signal processing section 58, and an output section 59.
  • the control unit 51 controls the operation of the light receiving element 31 as a whole. For example, the control unit 51 acquires the operation mode and measurement request supplied from the control device 11 via an input unit (not shown). Further, the control unit 51 instructs the pulse generation circuit 53 to generate a light emission pulse with a predetermined modulation frequency Fmod according to the operation mode. The control unit 51 supplies control signals for performing operations according to the operation mode to the vertical driving unit 55, the column processing unit 56, the horizontal driving unit 57, and the like. Further, the control unit 51 instructs the signal processing unit 58 to perform predetermined signal processing performed by the signal processing unit 58 according to the operation mode, such as depth image generation processing and luminance image generation processing.
  • a plurality of pixels 61 are two-dimensionally arranged in a matrix in the pixel array section 52 .
  • Each of the pixels 61 two-dimensionally arranged in the pixel array section 52 includes a photodiode 71 as a photoelectric conversion section that generates an electric charge corresponding to the amount of light received, and two taps (FDs 73) for accumulating the charges generated by the photodiode 71. These two taps are also called the first tap and the second tap.
  • the pulse generation circuit 53 generates a light emission pulse with a predetermined modulation frequency Fmod under the control of the control unit 51 and outputs it to the LD 21 of the lighting device 13 .
  • the modulation frequency Fmod of the light emission pulse is set to, for example, 20 MHz during low-speed driving, and is set to 100 MHz during high-speed driving. Note that this modulation frequency is merely an example, and is not limited to this numerical value.
  • the pulse generation circuit 53 generates drive signals GDA and GDB corresponding to the light emission pulse with the modulation frequency Fmod and supplies them to the tap drive section 54 .
  • The drive signal GDA is a drive signal for transferring the charge generated by the photoelectric conversion unit of each pixel 61 to the first tap, and the drive signal GDB is a drive signal for transferring that charge to the second tap.
  • The drive signal GDA and the drive signal GDB are signals whose phases are inverted with respect to each other.
  • The tap drive section 54 distributes the two drive signals GDA and GDB supplied from the pulse generation circuit 53 to generate a drive signal GDA' and a drive signal GDB' for each pixel column, and supplies them to each pixel column.
  • the tap drive section 54 supplies the drive signal GDA' and the drive signal GDB' for each pixel column of the pixel array section 52, thereby controlling charge distribution to the two taps of each pixel 61.
  • a pixel drive line 63 is wired along the horizontal direction for each pixel row, and two vertical signal lines 64A and 64B are wired along the vertical direction for each pixel column.
  • the pixel drive line 63 transmits a drive signal for driving when reading out the detection signal VSL from each pixel 61 .
  • the pixel drive line 63 is shown as one wiring, but it is not limited to one.
  • One end of the pixel driving line 63 is connected to an output terminal corresponding to each pixel row of the vertical driving section 55 .
  • the vertical signal line 64A is a signal line for transmitting the detection signal VSLA of the first tap to the column processing section 56
  • the vertical signal line 64B is a signal line for transmitting the detection signal VSLB of the second tap to the column processing section 56.
  • The vertical driving section 55 is composed of a shift register, an address decoder, and the like, and is a pixel driving unit that drives the pixels 61 of the pixel array section 52, either all simultaneously or in units of rows, through the pixel driving lines 63 arranged in the horizontal direction.
  • the detection signals VSLA and VSLB output from each pixel 61 in the pixel row selectively scanned by the vertical driving section 55 are supplied as pixel signals to the column processing section 56 through the vertical signal lines 64A or 64B.
  • The column processing unit 56 performs predetermined signal processing on the pixel signals input from each pixel 61 of the selected row via the vertical signal lines 64A and 64B for each pixel column of the pixel array unit 52, and temporarily holds the pixel signals after the signal processing. For example, the column processing unit 56 executes AD (analog-to-digital) conversion of the pixel signals as the signal processing.
  • the horizontal driving section 57 is composed of a shift register, an address decoder, etc., and selects unit circuits corresponding to the pixel columns of the column processing section 56 in order. By selective scanning by the horizontal driving unit 57 , the pixel signals processed by the column processing unit 56 are sequentially output to the signal processing unit 58 .
  • The signal processing unit 58 has a predetermined arithmetic processing function, performs predetermined arithmetic processing on the pixel signals output from the column processing unit 56 as necessary, and outputs the result to the control device 11 (FIG. 1) via the output unit 59.
  • For example, when the operation mode is the ranging mode, the signal processing unit 58 calculates distance information to the subject based on the pixel data (detection signals VSLA and VSLB) for each tap of each pixel 61 supplied from the column processing unit 56, and generates a depth image in which the distance information is stored as the pixel value of each pixel.
  • When the operation mode is the 2D image shooting mode, the signal processing unit 58 performs processing for generating a luminance image of the subject based on the pixel data for each tap of each pixel 61 supplied from the column processing unit 56.
  • The output unit 59 is configured by a predetermined communication interface such as MIPI (Mobile Industry Processor Interface), and outputs the depth image and the luminance image, which are the arithmetic processing results of the signal processing unit 58, to the control device 11 as measurement data.
  • the light-receiving element 31 configured as described above performs a light-receiving operation in the operation mode specified by the control device 11 .
  • When the ranging mode is specified as the operation mode, the light receiving element 31 switches the modulation frequency Fmod of the light emission pulse between at least two frequencies, for low-speed driving and high-speed driving, performs the light reception operation, and generates and outputs a depth image.
  • the first modulation frequency Fmod1 for high-speed driving is set to, for example, 100 MHz, and the second modulation frequency Fmod2 for low-speed driving is set to, for example, 20 MHz.
  • When the 2D image shooting mode is specified as the operation mode, the light receiving element 31 generates and outputs a luminance image of the subject.
  • the illumination light may or may not be emitted at a predetermined modulation frequency Fmod as in the ranging mode.
  • When the irradiation light is not emitted, the output of the light emission pulse to the illumination device 13 is stopped.
  • the modulation frequency Fmod for emitting irradiation light in the 2D image capturing mode does not need to be the same as any modulation frequency Fmod for operating in the ranging mode.
  • FIG. 3 shows a circuit configuration example of the pixel 61.
  • a pixel 61 in FIG. 3 includes a photodiode 71 as a photoelectric conversion unit.
  • The pixel 61 also has, corresponding to the two taps, two each of transfer transistors 72, FDs (floating diffusion regions) 73, FD gate transistors 74, amplification transistors 75, reset transistors 76, and selection transistors 77. That is, the pixel 61 has transfer transistors 72A and 72B, FDs 73A and 73B, FD gate transistors 74A and 74B, amplification transistors 75A and 75B, reset transistors 76A and 76B, and selection transistors 77A and 77B. Elements marked with A are elements on the first tap side, and elements marked with B are elements on the second tap side.
  • The pixel 61 further includes a charge discharge transistor 78.
  • Each transistor included in the pixel 61 is composed of an N-type MOSFET.
  • the photodiode 71 generates and accumulates electric charges corresponding to the amount of reflected light received.
  • When the drive signal GDA' supplied to its gate electrode becomes active, the transfer transistor 72A becomes conductive, thereby transferring the charge accumulated in the photodiode 71 to the FD 73A.
  • When the drive signal GDB' supplied to its gate electrode becomes active, the transfer transistor 72B becomes conductive, thereby transferring the charge accumulated in the photodiode 71 to the FD 73B.
  • FD73A is a first-tap charge storage unit that temporarily stores and holds the charge transferred from the photodiode 71 .
  • FD 73B is a second-tap charge storage unit that temporarily stores and holds the charge transferred from the photodiode 71 .
  • The FD gate transistor 74A becomes conductive when the signal supplied to its gate electrode becomes active, thereby connecting the FD 73A to an additional capacitance, and the FD gate transistor 74B likewise becomes conductive, thereby connecting the FD 73B to an additional capacitance. By switching the FD gate transistors 74A and 74B on and off, the storage capacitance can be changed.
  • the amplification transistor 75A is connected to a constant current source (not shown) by connecting the source electrode to the vertical signal line 64A through the selection transistor 77A, thereby forming a source follower circuit.
  • the amplification transistor 75B is connected to a constant current source (not shown) by connecting the source electrode to the vertical signal line 64B via the selection transistor 77B, thereby forming a source follower circuit.
  • the reset transistor 76A resets the potential of the FD 73A by becoming conductive in response to the reset drive signal RST supplied to the gate electrode becoming active.
  • the reset transistor 76B becomes conductive in response to the reset drive signal RST supplied to its gate electrode becoming active, thereby resetting the potential of the FD 73B.
  • When the reset transistors 76A and 76B are activated, the FD gate transistors 74A and 74B are simultaneously activated.
  • In the pixel 61 of FIG. 3, there is only one reset driving signal line 82, and the reset driving signal RST is shared by the reset transistors 76A and 76B. Alternatively, a reset driving signal line may be provided individually for each of the reset transistors 76A and 76B so that on and off of each can be controlled independently.
  • a predetermined drain voltage RSTDRAIN is supplied to the drains of the reset transistors 76A and 76B.
  • The selection transistor 77A is connected between the amplification transistor 75A and the vertical signal line 64A, becomes conductive when the selection signal SEL supplied to its gate electrode becomes active, and outputs the detection signal VSLA from the amplification transistor 75A to the vertical signal line 64A.
  • The selection transistor 77B is connected between the amplification transistor 75B and the vertical signal line 64B, becomes conductive when the selection signal SEL supplied to its gate electrode becomes active, and outputs the detection signal VSLB from the amplification transistor 75B to the vertical signal line 64B.
  • a reset operation for resetting the charges of the pixels 61 is performed in all pixels before light reception is performed. That is, the FD gate transistors 74A and 74B and the reset transistors 76A and 76B are turned on to discharge the accumulated charge of the FDs 73A and 73B, and the charge discharge transistor 78 is turned on to discharge the accumulated charge of the photodiode 71. .
  • After the accumulated charges are discharged, all pixels start receiving light. Specifically, as shown in FIG. 4, the light emitting unit 22 outputs irradiation light that is modulated so as to repeatedly turn on and off with an irradiation time T, and the reflected light is received by the photodiode 71 with a delay time ΔT corresponding to the distance to the subject.
  • the drive signal GDA' controls on/off of the transfer transistor 72A, and the drive signal GDB' controls on/off of the transfer transistor 72B.
  • the drive signal GDA' is, for example, a signal having the same phase as that of the irradiation light, and the drive signal GDB' has an inverted phase of the drive signal GDA'.
  • The charge generated by the photodiode 71 receiving the reflected light is transferred to the FD 73A while the transfer transistor 72A is on according to the drive signal GDA', and is transferred to the FD 73B while the transfer transistor 72B is on according to the drive signal GDB'.
  • In this way, the charges transferred via the transfer transistor 72A are sequentially accumulated in the FD 73A, and the charges transferred via the transfer transistor 72B are sequentially accumulated in the FD 73B. When the FD gate transistors 74A and 74B are on, the charges are also accumulated in the additional capacitances.
  • The pixel 61 thus distributes the charge generated by the reflected light received at the photodiode 71 between the first tap (FD 73A) and the second tap (FD 73B) according to the delay time ΔT, and outputs a first-tap detection signal VSLA and a second-tap detection signal VSLB.
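  • As an illustrative sketch (not part of the patent), the charge distribution described above can be modeled with an idealized rectangular reflected pulse: the fraction of charge collected by the second tap grows linearly with the delay ΔT, so the distribution ratio encodes the distance. The helper names below are hypothetical.

```python
C = 299_792_458.0  # speed of light [m/s]

def tap_charges(delta_t, t_pulse, q_total=1.0):
    """Idealized split of the photo-charge between the first tap (gate
    open during [0, T)) and the second tap (gate open during [T, 2T))
    for a rectangular reflected pulse of width T = t_pulse arriving
    with delay delta_t, where 0 <= delta_t <= t_pulse."""
    assert 0.0 <= delta_t <= t_pulse
    q_b = q_total * delta_t / t_pulse   # portion falling in the tap-B window
    q_a = q_total - q_b                 # remainder collected by tap A
    return q_a, q_b

def delay_from_charges(q_a, q_b, t_pulse):
    """Invert the distribution ratio to recover the delay time."""
    return t_pulse * q_b / (q_a + q_b)

# Example: 5 ns pulse, target about 0.3 m away -> delta_t = 2*d/c (about 2 ns)
delta_t = 2.0 * 0.3 / C
q_a, q_b = tap_charges(delta_t, 5e-9)
d_recovered = C * delay_from_charges(q_a, q_b, 5e-9) / 2.0  # close to 0.3 m
```

In practice the pulse is not perfectly rectangular and ambient light adds an offset to both taps, which is why the phase-based I/Q formulation described later is used instead of this direct ratio.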
  • However, the detection signal VSLA and the detection signal VSLB are affected differently in each pixel 61 by characteristic deviations (sensitivity differences) of the elements such as the photodiode 71 and the transfer transistors 72 of each pixel 61. Therefore, a method is adopted in which the same pixel 61 acquires the detection signal VSLA and the detection signal VSLB while changing the phase, thereby canceling the sensitivity difference between the taps of each pixel 61 and improving the SN ratio.
  • Specifically, the light receiving element 31 receives the reflected light at light reception timings whose phases are shifted by 0°, 90°, 180°, and 270° with respect to the irradiation timing of the irradiation light. More specifically, the light receiving element 31 receives the reflected light with a phase of 0° with respect to the irradiation timing in one frame period, with a phase of 90° in the next frame period, with a phase of 180° in the frame period after that, and with a phase of 270° in the following frame period, changing the phase in a time-division manner.
  • Here, the phase of 0°, 90°, 180°, or 270° represents the phase at the first tap of the pixel 61 unless otherwise specified. Since the second tap operates in the opposite phase to the first tap, when the first tap has a phase of 0°, 90°, 180°, or 270°, the second tap has a phase of 180°, 270°, 0°, or 90°, respectively.
  • FIG. 6 is a diagram showing the exposure periods of the first tap of the pixel 61 at each phase of 0°, 90°, 180°, and 270° so that the phase difference is easy to understand.
  • The detection signal VSLA obtained at the first tap by receiving light in the same phase as the irradiation light (phase 0°) is referred to as detection signal A0, the detection signal VSLA obtained by receiving light at a phase shifted by 90° from the irradiation light (phase 90°) as detection signal A90, the detection signal VSLA obtained at a phase shifted by 180° (phase 180°) as detection signal A180, and the detection signal VSLA obtained at a phase shifted by 270° (phase 270°) as detection signal A270.
  • Similarly, the detection signal VSLB obtained at the second tap by receiving light in the same phase as the irradiation light (phase 0°) is referred to as detection signal B0, the detection signal VSLB obtained at a phase shifted by 90° (phase 90°) as detection signal B90, the detection signal VSLB obtained at a phase shifted by 180° (phase 180°) as detection signal B180, and the detection signal VSLB obtained at a phase shifted by 270° (phase 270°) as detection signal B270.
  • FIG. 7 is a diagram explaining how to calculate the depth value and reliability by the 2-Phase method and the 4-Phase method.
  • the depth value d can be obtained by the following formula (1).
  • In Equation (1), c is the speed of light, ΔT is the delay time, and f is the modulation frequency Fmod of the light.
  • ⁇ in Equation (1) represents the amount of phase shift [rad] of the reflected light and is expressed by the following Equation (2).
  • In the 4-Phase method, I and Q in Equation (2) are calculated by the following Equation (3) from the detection signals A0 to A270 and B0 to B270 obtained by setting the phases to 0°, 90°, 180°, and 270°.
  • I and Q are signals obtained by converting the phase of the sine wave from polar coordinates to a rectangular coordinate system (IQ plane), assuming that the luminance change of the irradiation light is a sine wave.
  • In the 2-Phase method, I and Q in Equation (2) can be calculated using the detection signals of only two phases, 0° and 90°. That is, I and Q of Equation (2) in the 2-Phase method are given by the following Equation (4).
  • Although the 2-Phase method cannot cancel the characteristic variations between taps that exist in each pixel, it can obtain the depth value d to the object using the detection signals of only two phases, so distance measurement can be performed at a higher frame rate. Characteristic variations between taps can be adjusted by correction parameters such as gain and offset, for example.
  • The reliability cnf can be obtained by the following Equation (5) in both the 2-Phase method and the 4-Phase method. As can be seen from Equation (5), the reliability cnf corresponds to the magnitude of the reflected light received by the pixel 61, that is, luminance information (a luminance value).
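  • Equations (1) to (5) referenced above are rendered as images in the original publication. As a hedged reconstruction, the standard indirect-ToF relations matching the surrounding description are as follows; the differential 4-Phase form of I and Q, which cancels the tap sensitivity difference, is an assumption consistent with the text but not confirmed by it:

```latex
\begin{align*}
d &= \frac{c\,\Delta T}{2} = \frac{c\,\phi}{4\pi f} \tag{1}\\
\phi &= \arctan\frac{Q}{I} \tag{2}\\
\text{4-Phase:}\quad I &= (A_{0}-B_{0})-(A_{180}-B_{180}),\qquad
Q = (A_{90}-B_{90})-(A_{270}-B_{270}) \tag{3}\\
\text{2-Phase:}\quad I &= A_{0}-B_{0},\qquad Q = A_{90}-B_{90} \tag{4}\\
\mathit{cnf} &= \sqrt{I^{2}+Q^{2}} \tag{5}
\end{align*}
```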
  • Here, a unit in which each pixel 61 of the pixel array section 52 outputs pixel data (a detection signal) of one phase, such as 0°, 90°, 180°, or 270°, is referred to as one frame (period).
  • In the 4-Phase method, one depth image can be generated from four frames corresponding to the four phases at one modulation frequency Fmod. In the 2-Phase method, one depth image can be generated from the two frames corresponding to the phases of 0° and 90°.
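  • As an illustrative sketch (not part of the patent), the 4-Phase depth and reliability computation described above can be written as follows; the dictionary-based interface and the differential (A − B) form of I and Q are assumptions:

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def depth_4phase(a, b, fmod_hz):
    """Compute the depth d and confidence cnf from 4-Phase samples.

    a, b: dicts mapping the phase in degrees (0, 90, 180, 270) to the
    first-tap (VSLA) and second-tap (VSLB) detection signals.  Taking the
    difference (A - B) at each phase cancels common offsets between taps."""
    i = (a[0] - b[0]) - (a[180] - b[180])
    q = (a[90] - b[90]) - (a[270] - b[270])
    phi = math.atan2(q, i) % (2.0 * math.pi)  # phase shift in [0, 2*pi)
    d = C * phi / (4.0 * math.pi * fmod_hz)   # depth from the phase shift
    cnf = math.hypot(i, q)                    # confidence ~ received amplitude
    return d, cnf
```

With a sinusoidal irradiation model, feeding ideal samples for a known phase shift into `depth_4phase` returns the corresponding depth, while ambient-light offsets common to both taps drop out of I and Q.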
  • Meanwhile, in indirect ToF distance measurement, an aliasing problem occurs. That is, since the phase difference is detected and converted into a distance, the maximum measurement range is determined by the modulation frequency Fmod of the light emitting unit 22, and when the maximum measurement distance is exceeded, the detected phase difference wraps around and starts again from zero. As a result, for example, when the modulation frequency Fmod is 100 MHz, distances of 1.5 m and 3 m cannot be distinguished. Therefore, it is necessary to perform distance measurement with a plurality of modulation frequencies Fmod, such as 100 MHz and 20 MHz. In the 4-Phase method, four frames of light reception data are required per modulation frequency Fmod, so when performing distance measurement with a plurality of modulation frequencies Fmod, light emission and light reception for at least eight frames are required to output one depth image.
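  • The aliasing behavior can be sketched numerically (an illustration, not part of the patent): the unambiguous range is c/(2·Fmod), so two targets separated by exactly that distance produce the same wrapped phase at 100 MHz but clearly different phases at 20 MHz, which is why measuring at both frequencies resolves the ambiguity.

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def unambiguous_range(fmod_hz):
    """Maximum distance measurable without phase wrap-around: c / (2*Fmod)."""
    return C / (2.0 * fmod_hz)

def wrapped_phase(distance_m, fmod_hz):
    """Phase shift of the reflected light, wrapped into [0, 2*pi)."""
    return (4.0 * math.pi * fmod_hz * distance_m / C) % (2.0 * math.pi)

f_hi, f_lo = 100e6, 20e6
d_max = unambiguous_range(f_hi)           # about 1.5 m at 100 MHz
p_near = wrapped_phase(0.5, f_hi)
p_far = wrapped_phase(0.5 + d_max, f_hi)  # aliases onto the same phase
q_near = wrapped_phase(0.5, f_lo)
q_far = wrapped_phase(0.5 + d_max, f_lo)  # distinguishable at 20 MHz
```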
  • the light receiving element 31 receives the irradiation light at a high-speed first modulation frequency Fmod1 and at a second modulation frequency Fmod2 lower than the first, generates one depth image, and outputs it to the control device 11 as measurement data.
  • the light receiving element 31 receives the irradiation light at the second modulation frequency Fmod2 on the low-speed side, calculates the reliability cnf as the luminance information described above, generates a luminance image in which the cnf values are stored as pixel values, and outputs it to the control device 11 as measurement data.
  • alternatively, instead of using the reliability cnf as the pixel value, the light receiving element 31 may generate a luminance image by storing, as pixel values, data obtained by summing the AD-converted data of the detection signal VSLA of the first tap and the AD-converted data of the detection signal VSLB of the second tap, and may output it to the control device 11 as measurement data.
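A minimal sketch of the tap-sum variant of the luminance image described above, treating the AD-converted tap data as plain integers (function names are illustrative):

```python
def luminance_from_taps(ad_a, ad_b):
    # Summing the AD-converted first-tap and second-tap detection signals
    # (VSLA + VSLB) approximates the total light received by the pixel.
    return ad_a + ad_b

def luminance_image(frame_a, frame_b):
    # frame_a / frame_b: 2-D lists of AD-converted tap data, same shape.
    return [[luminance_from_taps(a, b) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]
```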
  • the light receiving element 31 can perform control to change the charge accumulation amount of the photodiode 71 according to the operation mode and the magnitude of the amount of light received.
  • FIG. 8 is a diagram illustrating driving of the transfer transistor 72 when the pixel 61 operates at high speed, in other words, when the pixel 61 operates at the first modulation frequency Fmod1.
  • PD represents the photodiode 71
  • TGA represents the transfer transistor 72A on the first tap side
  • FDA represents the FD73A which is the first tap
  • TGB represents the transfer transistor 72B on the second tap side
  • FDB represents the FD 73B which is the second tap.
  • the diagram on the left side of FIG. 8 shows a state in which both transfer transistors 72A and 72B are controlled to be off.
  • a predetermined voltage VSS is supplied to the gate electrodes of the transfer transistors 72A and 72B as an off-voltage TGL for turning off the transfer transistors 72A and 72B.
  • the charge amount that can be stored in the photodiode 71 in this state is PDQs.
  • the central diagram in FIG. 8 shows a state in which the transfer transistor 72A on the first tap side is controlled to be ON.
  • the tap drive unit 54 applies a predetermined voltage VDD (>VSS) higher than the voltage VSS as the ON voltage TGH to the gate electrode of the transfer transistor 72A.
  • the diagram on the right side of FIG. 8 shows a state in which the transfer transistor 72B on the second tap side is controlled to be ON.
  • the tap drive unit 54 applies a voltage VDD (>VSS) higher than the voltage VSS as an on-voltage TGH to the gate electrode of the transfer transistor 72B.
  • FIG. 9 is a diagram for explaining the driving of the transfer transistor 72 during low-speed operation compared to high-speed operation.
  • the off-voltage TGL for turning off the transfer transistor 72 is set to a voltage VSS1 (&lt;VSS) lower than the voltage VSS used during high-speed operation.
  • the barrier when the transfer transistors 72A and 72B are turned off is increased, so that the amount of charge PDQs that can be stored in the photodiode 71 can be made larger than during high-speed operation.
  • the on-voltage TGH for turning on the transfer transistor 72 during low-speed operation is the same voltage VDD as during high-speed operation.
  • in the above, the operation mode has been the ranging mode, and the difference in the voltage applied when the transfer transistor is turned off between high-speed operation and low-speed operation has been explained.
  • the applied voltage when the transfer transistor is turned off is changed to a voltage VSS1 lower than that during high-speed operation.
  • FIG. 10 is a table summarizing the cases in which the light-receiving element 31 changes the voltage applied when the transfer transistor is turned off to the voltage VSS1 lower than that during high-speed operation.
  • the tap drive unit 54 of the light receiving element 31 performs control to set the off-voltage TGL of the transfer transistor 72 to the voltage VSS when the operation mode is the distance measurement mode and the light receiving operation is performed at the high-speed first modulation frequency Fmod1.
  • when the light receiving operation is performed at the low-speed second modulation frequency Fmod2, control is performed to set the off-voltage TGL of the transfer transistor 72 to the voltage VSS1 lower than the voltage VSS.
  • when the operation mode is the 2D image capturing mode, the tap drive unit 54 performs control to set the off-voltage TGL of the transfer transistor 72 to the voltage VSS1 lower than the voltage VSS, regardless of the modulation frequency Fmod.
  • furthermore, when it is determined that the amount of light received by the pixel array unit 52 is large, the tap drive unit 54 performs control to set the off-voltage TGL of the transfer transistor 72 to the voltage VSS1 lower than the voltage VSS.
  • whether or not the amount of light received by the pixel array section 52 is large is determined, for example, as follows. When the pixel data of the first frame is acquired and the pixel data of a predetermined number or more of pixels, or a predetermined ratio or more of the number of pixels from which pixel data is acquired in the pixel array section 52, indicate a saturated value, it can be determined that the amount of received light is large. Alternatively, it may be determined that the amount of received light is large when saturation of pixel data occurs in an area of N×M pixels or more (N&gt;1, M&gt;1) in the pixel array section 52 from which pixel signals are acquired.
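The two saturation-based decision rules can be sketched as follows (the saturation value, ratio threshold, and region size are placeholders):

```python
def is_high_light(frame, sat_value, ratio_threshold=0.1):
    # Count-based rule: the received-light amount is judged large when a
    # predetermined ratio or more of acquired pixels reads a saturated value.
    total = sum(len(row) for row in frame)
    saturated = sum(v >= sat_value for row in frame for v in row)
    return total > 0 and saturated >= ratio_threshold * total

def has_saturated_region(frame, sat_value, n=2, m=2):
    # Region-based rule: saturation occurring over an n x m block of pixels
    # (n > 1, m > 1) also counts as a large received-light amount.
    rows, cols = len(frame), len(frame[0])
    for r in range(rows - n + 1):
        for c in range(cols - m + 1):
            if all(frame[r + i][c + j] >= sat_value
                   for i in range(n) for j in range(m)):
                return True
    return False
```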
  • FIG. 11 is a block diagram showing a configuration example of an off-voltage control section that controls the off-voltage TGL of the transfer transistor 72 according to the operation mode or the like.
  • the off-voltage control section 101 can be provided in the tap driving section 54 as shown in FIG. 11, for example.
  • the tap drive unit 54 distributes the two tap drive signals GDA and GDB supplied from the pulse generation circuit 53 to generate the drive signals GDA' and GDB' for each pixel column, and supplies them to each pixel column of the pixel array unit 52.
  • the off-voltage control unit 101 switches between the voltage VSS and the voltage VSS1 as the low level of the drive signals GDA′ and GDB′, that is, the off-voltage TGL of the transfer transistor 72 .
  • the High level of drive signals GDA' and GDB' is voltage VDD.
  • the off voltage control section 101 is composed of a circuit including an OR circuit 111 , an inverter 112 , and transistors 113 and 114 .
  • Transistors 113 and 114 are N-type MOSFETs, and constitute a switch that switches between voltage VSS1 and voltage VSS as an off-control voltage for turning off transfer transistor 72 .
  • An operation mode determination signal, a frequency determination signal, and a high light intensity determination signal are input to the OR circuit 111 .
  • the OR circuit 111 performs a logical sum operation of the operation mode determination signal, the frequency determination signal, and the high light intensity determination signal, and outputs a High (1) or Low (0) signal as the operation result to the gate electrode of the transistor 113 and to the inverter 112.
  • the operation mode determination signal is a signal that is High (1) when the operation mode is the 2D image shooting mode and Low (0) when the operation mode is the ranging mode, and is supplied from the control unit 51, for example.
  • the frequency determination signal is a signal that becomes High (1) when the modulation frequency Fmod is equal to or less than a predetermined threshold frequency Fmod_TH, and becomes Low (0) when it is greater than the threshold frequency Fmod_TH.
  • the frequency determination signal can be generated, for example, by referring to a specific address signal that is always fixed to High or Low when the frequency divider in the pulse generation circuit 53 divides the frequency by X or more.
  • the high light intensity determination signal is a signal that becomes High (1) when the amount of received light is large and becomes Low (0) when the amount of light received is not large.
  • the signal processing unit 58 outputs, as the high light intensity determination signal, the result of determining whether or not pixel data of a predetermined number or more, or a predetermined ratio or more, of the pixels from which pixel data is acquired in the pixel array unit 52 is saturated, and supplies it to the off-voltage control section 101. Factors that cause the amount of received light to be large include a short distance to the subject, a large amount of emitted light, and a high reflectance of the subject.
  • the output of the OR circuit 111 is directly supplied to the gate electrode of the transistor 113, and is supplied to the gate electrode of the transistor 114 via the inverter 112. Therefore, when the transistor 113 is on, the transistor 114 is off, and when the transistor 113 is off, the transistor 114 is on.
  • a voltage VSS1 is supplied to the source of the transistor 113 .
  • a voltage VSS (>VSS1) is supplied to the source of the transistor 114 .
  • Voltages VSS and VSS1 may be provided exclusively for off-voltage control of transfer transistor 72, or may be shared with other controls.
  • the voltage VSS can be shared with the ground voltage applied to the Pwell of the semiconductor substrate, and the voltage VSS1 can be shared with the off control voltage of other pixel transistors (eg, select transistor 77).
  • when the operation mode is the 2D image capturing mode, when the modulation frequency Fmod is equal to or less than the predetermined threshold frequency Fmod_TH, or when the amount of received light is large, the Low level of the drive signals GDA' and GDB' is controlled to the voltage VSS1.
  • when the operation mode is the ranging mode, the modulation frequency Fmod is greater than the predetermined threshold frequency Fmod_TH, and the amount of received light is not large, the Low level of the drive signals GDA' and GDB' is controlled to the voltage VSS.
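The selection behavior of the off-voltage control section 101 reduces to an OR of the three determination signals driving a complementary switch pair; a behavioral sketch (the voltage values are placeholders, not the actual device voltages):

```python
def off_voltage(mode_is_2d, freq_is_low, light_is_high, vss=0.0, vss1=-1.0):
    # OR circuit 111: any of the three determination signals being High (1)
    # selects the deeper off-voltage VSS1.
    sel = mode_is_2d or freq_is_low or light_is_high
    # sel drives transistor 113 directly and transistor 114 through
    # inverter 112, so exactly one switch conducts at a time:
    # 113 on -> VSS1 is output; 114 on -> VSS is output.
    return vss1 if sel else vss
```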
  • the off-voltage control processing performed by the off-voltage control unit 101 will be described with reference to the flowchart of FIG. 12 .
  • This processing is started, for example, when a measurement request is supplied from the control device 11 to the light receiving device 12 and the light receiving element 31 starts light receiving operation.
  • the initial value of the off-voltage TGL of the transfer transistor 72 is set to the voltage VSS corresponding to high-speed driving.
  • in step S1, the off-voltage control unit 101 determines whether the operation mode is the 2D image shooting mode. If it is determined in step S1 that the operation mode is the 2D image shooting mode, the process proceeds to step S4, described later.
  • if it is determined in step S1 that the operation mode is not the 2D image capturing mode but the ranging mode, the process proceeds to step S2, and the off-voltage control unit 101 determines whether the modulation frequency Fmod is on the low-speed side, in other words, whether the modulation frequency Fmod is equal to or lower than the predetermined threshold frequency Fmod_TH. If it is determined in step S2 that the modulation frequency Fmod is low, the process proceeds to step S4, described later.
  • in step S3, the off-voltage control section 101 determines whether the amount of received light is large. Since the determination in step S3 requires acquisition of the depth value d for at least one frame, if it is determined in step S2 that the modulation frequency Fmod is high, the light receiving element 31 performs the light receiving operation with the off-voltage TGL of the transfer transistor 72 at the voltage VSS set as the initial value. The off-voltage control unit 101 then determines whether the amount of received light is large based on the light reception result of that one frame: when pixel data of a predetermined number or more, or a predetermined ratio or more, of the pixels from which pixel data is acquired in the pixel array section 52 are saturated, it is determined that the amount of received light is large.
  • if it is determined in step S3 that the amount of received light is large, the process proceeds to step S4; if it is determined that the amount of received light is not large, the process proceeds to step S5.
  • in step S4, the off-voltage control unit 101 performs control to set the off-voltage TGL of the transfer transistor 72 to the voltage VSS1 lower than the voltage VSS.
  • in step S5, the off-voltage control unit 101 performs control to set the off-voltage TGL of the transfer transistor 72 to the voltage VSS.
  • the off-voltage control process of FIG. 12 can be repeatedly executed while the light receiving operation continues.
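The S1–S5 flow above can be sketched procedurally as follows; the one-frame received-light judgment needed by S3 is abstracted as a callback, and the voltage values are placeholders:

```python
def off_voltage_control(mode_is_2d, fmod, fmod_th, light_is_high,
                        vss=0.0, vss1=-1.0):
    # S1: 2D image shooting mode -> S4 (use VSS1).
    if mode_is_2d:
        return vss1
    # S2: ranging mode at a low modulation frequency -> S4 (use VSS1).
    if fmod <= fmod_th:
        return vss1
    # S3: high-frequency ranging. One frame is first received with the
    # initial value VSS; light_is_high() stands in for the saturation
    # judgment made on that frame's result.
    if light_is_high():
        return vss1   # S4
    return vss        # S5
```

The process can be repeated while the light receiving operation continues, re-evaluating the received-light amount on each iteration.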
  • when the modulation frequency Fmod of the light emission pulse is greater than the predetermined threshold frequency Fmod_TH, the off-voltage TGL of the transfer transistor 72 is set to the default (initial-value) voltage VSS, and control is performed to shorten the transfer time.
  • when the modulation frequency Fmod of the light emission pulse is equal to or less than the predetermined threshold frequency Fmod_TH, when the operation mode is the 2D image capturing mode, or when the amount of received light is large, the off-voltage TGL of the transfer transistor 72 is set to the voltage VSS1 lower than the voltage VSS. This increases the charge storage amount of the photodiode 71 and prevents saturation.
  • since the charge storage amount of the photodiode 71 can be increased by setting the off-voltage TGL of the transfer transistor 72 to the voltage VSS1, one might think that it should always be set to the voltage VSS1.
  • however, since the operating power is proportional to the operating frequency, even if the operating voltage range is slightly widened, the power consumption during low-speed operation does not exceed that during high-speed operation, because the modulation frequency Fmod is suppressed.
  • although the light-receiving element 31 described above has two transfer transistors 72 in the pixel 61, one, or three or more, transfer transistors 72 may be arranged in one pixel.
  • the same off-voltage control as described above can be applied to a configuration with one transfer transistor 72 per pixel, such as an image sensor for viewing, or with four transfer transistors 72 per pixel.
  • with one transfer transistor 72 per pixel, the detection signal VSLA of the first tap and the detection signal VSLB of the second tap are obtained in separate frames.
  • with four transfer transistors 72 in one pixel, the light receiving element 31 can be operated so as to obtain the four phase detection signals VSL of 0°, 90°, 180°, and 270° in one frame.
  • the off-control voltage for the transfer transistor 72 may have three or more voltage values. For example, three types of off-control voltages, the voltage VSS, a voltage VSS1, and a voltage VSS2, may be switched: the voltage VSS may be used for the distance measurement mode and the voltages VSS1 and VSS2 for the 2D image capturing mode, with the voltages VSS1 and VSS2 further switched according to the type of scene in the 2D image capturing mode.
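The three-value modification might look like the following sketch; the scene categories and all voltage values are purely hypothetical:

```python
def off_voltage_3level(mode_is_2d, scene, vss=0.0, vss1=-0.5, vss2=-1.0):
    # Ranging mode keeps the shallow off-voltage VSS for fast transfer.
    if not mode_is_2d:
        return vss
    # 2D image capturing mode: VSS1 / VSS2 switched by scene type
    # (the "bright"/"dark" categories here are illustrative only).
    return vss2 if scene == "bright" else vss1
```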
  • in the above example, the off-voltage control unit 101 is provided in the tap drive unit 54, but the off-voltage control unit 101 may be placed anywhere along the signal path from the pulse generation circuit 53, which generates the drive signals GDA and GDB, to the tap drive unit 54.
  • the distance measuring system 1 described above can be installed in electronic devices such as smartphones, tablet terminals, mobile phones, personal computers, game machines, television receivers, wearable terminals, digital still cameras, and digital video cameras.
  • FIG. 13 is a block diagram showing a configuration example of a smartphone as an electronic device equipped with the ranging system of the present disclosure.
  • the smartphone 201 is configured by connecting a ranging module 202, an imaging device 203, a display 204, a speaker 205, a microphone 206, a communication module 207, a sensor unit 208, a touch panel 209, and a control unit 210 via a bus 211.
  • the control unit 210 also has functions as an application processing unit 221 and an operating system processing unit 222 by executing programs by the CPU.
  • the light receiving device 12 and lighting device 13 of the ranging system 1 in FIG. 1 are applied to the ranging module 202 .
  • the distance measurement module 202 is arranged on the front surface of the smartphone 201 and performs distance measurement for the user of the smartphone 201, whereby it can output the depth values of the surface shape of the user's face, hands, fingers, and the like as a distance measurement result. The functions of the control device 11 of the ranging system 1 of FIG. 1 can be performed by the control unit 210.
  • the imaging device 203 is arranged in front of the smartphone 201 and captures an image of the user of the smartphone 201 as a subject to obtain an image of the user.
  • the ranging module 202 and the imaging device 203 may be arranged on the back side of the smartphone 201, or may be arranged on both the front side and the back side.
  • the display 204 displays an operation screen for processing by the application processing unit 221 and the operation system processing unit 222, an image captured by the imaging device 203, and the like.
  • the speaker 205 and the microphone 206 output the voice of the other party and pick up the voice of the user, for example, when making a call using the smartphone 201 .
  • the communication module 207 performs communication via a communication network.
  • the sensor unit 208 senses speed, acceleration, proximity, and the like, and the touch panel 209 acquires a user's touch operation on the operation screen displayed on the display 204 .
  • the application processing unit 221 performs processing for providing various services by the smartphone 201.
  • the application processing unit 221 can create a computer graphics face that virtually reproduces the user's facial expression based on the depth image supplied from the distance measurement module 202 and display the face on the display 204 .
  • the application processing unit 221 can perform processing for creating three-dimensional shape data of an arbitrary three-dimensional object, for example, based on the depth image supplied from the distance measurement module 202 .
  • the operation system processing unit 222 performs processing for realizing the basic functions and operations of the smartphone 201 .
  • the operation system processing unit 222 can authenticate the user's face and unlock the smartphone 201 based on the depth image supplied from the ranging module 202 .
  • the operation system processing unit 222 performs, for example, a process of recognizing a user's gesture based on the depth image supplied from the distance measurement module 202, and performs a process of inputting various operations according to the gesture. can be done.
  • in this way, distance measurement information can be calculated and a two-dimensional luminance image can be generated, and the accuracy of the distance measurement information can be improved.
  • the technology (the present technology) according to the present disclosure can be applied to various products.
  • the technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as an automobile, electric vehicle, hybrid electric vehicle, motorcycle, bicycle, personal mobility device, airplane, drone, ship, or robot.
  • FIG. 14 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • a vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an exterior information detection unit 12030, an interior information detection unit 12040, and an integrated control unit 12050.
  • as functional components of the integrated control unit 12050, a microcomputer 12051, an audio/image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
  • the drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • the drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • the body system control unit 12020 controls the operation of various devices equipped on the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as headlamps, back lamps, brake lamps, winkers or fog lamps.
  • the body system control unit 12020 can receive radio waves transmitted from a portable device that substitutes for a key or signals from various switches.
  • the body system control unit 12020 receives the input of these radio waves or signals and controls the door lock device, power window device, lamps, etc. of the vehicle.
  • the vehicle exterior information detection unit 12030 detects information outside the vehicle in which the vehicle control system 12000 is installed.
  • the vehicle exterior information detection unit 12030 is connected with an imaging section 12031 .
  • the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the exterior of the vehicle, and receives the captured image.
  • the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for people, vehicles, obstacles, signs, characters on the road surface, or the like based on the received image.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal according to the amount of received light.
  • the imaging unit 12031 can output the electric signal as an image, and can also output it as distance measurement information.
  • the light received by the imaging unit 12031 may be visible light or non-visible light such as infrared rays.
  • the in-vehicle information detection unit 12040 detects in-vehicle information.
  • the in-vehicle information detection unit 12040 is connected to, for example, a driver state detection section 12041 that detects the state of the driver.
  • the driver state detection unit 12041 includes, for example, a camera that captures an image of the driver, and based on the detection information input from the driver state detection unit 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
  • the microcomputer 12051 can calculate control target values for the driving force generating device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output a control command to the drive system control unit 12010.
  • the microcomputer 12051 can perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or shock mitigation of the vehicle, follow-up driving based on inter-vehicle distance, vehicle-speed-maintaining driving, vehicle collision warning, or vehicle lane departure warning.
  • the microcomputer 12051 can perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on information about the surroundings of the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
  • the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
  • the microcomputer 12051 can perform cooperative control aimed at anti-glare, such as switching from high beam to low beam, by controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030.
  • the audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying information to the passengers of the vehicle or to the outside of the vehicle.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
  • the display unit 12062 may include at least one of an on-board display and a head-up display, for example.
  • FIG. 15 is a diagram showing an example of the installation position of the imaging unit 12031.
  • the vehicle 12100 has imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
  • the imaging units 12101, 12102, 12103, 12104, and 12105 are provided at positions such as the front nose of the vehicle 12100, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior, for example.
  • An image pickup unit 12101 provided in the front nose and an image pickup unit 12105 provided above the windshield in the passenger compartment mainly acquire images in front of the vehicle 12100 .
  • Imaging units 12102 and 12103 provided in the side mirrors mainly acquire side images of the vehicle 12100 .
  • An imaging unit 12104 provided in the rear bumper or back door mainly acquires an image behind the vehicle 12100 .
  • Forward images acquired by the imaging units 12101 and 12105 are mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
  • FIG. 15 shows an example of the imaging ranges of the imaging units 12101 to 12104. The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 viewed from above can be obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the imaging units 12101 to 12104 may be a stereo camera composed of a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
  • the microcomputer 12051 determines the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the change in this distance over time (the relative velocity with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the closest three-dimensional object on the course of the vehicle 12100 that travels at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100. Furthermore, the microcomputer 12051 can set an inter-vehicle distance to be secured in advance in front of the preceding vehicle, and perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, cooperative control can be performed for the purpose of automated driving in which the vehicle travels autonomously without relying on the driver's operation.
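The preceding-vehicle extraction described above can be sketched as a filter-and-min over detected objects; the object fields used here are assumptions for illustration, not from this disclosure:

```python
def extract_preceding_vehicle(objects, min_speed_kmh=0.0):
    # Keep objects on the own vehicle's course that travel in roughly the
    # same direction at or above the threshold speed, then pick the nearest.
    candidates = [o for o in objects
                  if o["on_course"]
                  and o["same_direction"]
                  and o["speed_kmh"] >= min_speed_kmh]
    return min(candidates, key=lambda o: o["distance_m"], default=None)
```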
  • the microcomputer 12051 can classify three-dimensional object data related to three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see. Then, the microcomputer 12051 judges the collision risk indicating the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can output an alarm to the driver via the audio speaker 12061 and the display unit 12062, or perform forced deceleration and avoidance steering via the drive system control unit 12010, thereby providing driving support for collision avoidance.
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • the microcomputer 12051 can recognize a pedestrian by determining whether or not the pedestrian exists in the captured images of the imaging units 12101 to 12104 .
  • such recognition of a pedestrian is performed by, for example, a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not it is a pedestrian.
  • when a pedestrian is recognized, the audio/image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The audio/image output unit 12052 may also control the display unit 12062 to display an icon or the like indicating the pedestrian at a desired position.
  • the technology according to the present disclosure can be applied to the vehicle exterior information detection unit 12030 and the vehicle interior information detection unit 12040 among the configurations described above.
  • for example, based on the depth image, a process of recognizing the driver's gesture can be performed, various devices (for example, an audio system, a navigation system, or an air conditioning system) can be operated according to the gesture, and the driver's condition can be detected more accurately.
  • the configuration described as one device (or processing unit) may be divided and configured as a plurality of devices (or processing units).
  • the configuration described above as a plurality of devices (or processing units) may be collectively configured as one device (or processing unit).
  • part of the configuration of one device (or processing unit) may be included in the configuration of another device (or another processing unit) as long as the configuration and operation of the system as a whole are substantially the same.
  • a system means a set of multiple components (devices, modules (parts), etc.), and it does not matter whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device housing a plurality of modules in one housing, are both systems.
  • Note that the technology of the present disclosure can also take the following configurations.
  • (1) A light-receiving element comprising: a pixel having a photoelectric conversion unit and a transfer transistor that transfers charge generated by the photoelectric conversion unit; and a voltage control unit that switches among a plurality of voltage values as an off-control voltage for turning off the transfer transistor.
  • (2) The light-receiving element according to (1), wherein the pixel includes: two of the transfer transistors, namely first and second transfer transistors; a first charge storage unit that holds the charge transferred by the first transfer transistor; and a second charge storage unit that holds the charge transferred by the second transfer transistor; the photoelectric conversion unit receives and photoelectrically converts reflected light that was emitted from a predetermined light source and reflected by an object; and the voltage control unit switches between a first voltage and a second voltage lower than the first voltage as the off-control voltages of the first and second transfer transistors.
  • (3) The light-receiving element according to (2), wherein the first and second transfer transistors are alternately turned on at a predetermined modulation frequency, and the voltage control unit switches the off-control voltages of the first and second transfer transistors to the first voltage or the second voltage according to the predetermined modulation frequency.
  • (4) The light-receiving element according to (3), wherein the predetermined modulation frequency includes a first modulation frequency and a second modulation frequency slower than the first modulation frequency, and the voltage control unit switches the off-control voltage to the first voltage when the modulation frequency is the first modulation frequency, and switches the off-control voltage to the second voltage when the modulation frequency is the second modulation frequency.
  • (5) The light-receiving element according to (3) or (4), wherein the voltage control unit switches the off-control voltage to the first voltage when the predetermined modulation frequency is higher than a predetermined threshold frequency, and switches the off-control voltage to the second voltage when the predetermined modulation frequency is equal to or lower than the predetermined threshold frequency.
  • (6) The light-receiving element according to any one of (2) to (5), wherein the voltage control unit switches the off-control voltages of the first and second transfer transistors to the first voltage or the second voltage according to the operation mode.
  • (7) The light-receiving element according to (6), wherein the voltage control unit switches the off-control voltage to the first voltage when the operation mode is a ranging mode for measuring a distance, and switches the off-control voltage to the second voltage when the operation mode is a 2D imaging mode for generating a two-dimensional luminance image.
  • (8) The light-receiving element according to any one of (2) to (7), wherein the voltage control unit switches the off-control voltages of the first and second transfer transistors to the first voltage or the second voltage according to the amount of received light.
  • (9) The light-receiving element according to (8), wherein the voltage control unit switches the off-control voltage to the first voltage when it is determined that the amount of received light is not large, and switches the off-control voltage to the second voltage when it is determined that the amount of received light is large.
  • (10) The light-receiving element according to (9), wherein it is determined that the amount of received light is large when a predetermined number or more of pieces of pixel data, or a predetermined ratio or more of pieces of pixel data with respect to the number of pixels from which pixel data is acquired, are saturated.
  • (11) The light-receiving element according to any one of (1) to (10), wherein the voltage control unit includes a transistor that switches between a first voltage and a second voltage lower than the first voltage as the off-control voltage.
  • (12) A method of driving a light-receiving element including a pixel having a photoelectric conversion unit and a transfer transistor that transfers charge generated by the photoelectric conversion unit, the method comprising switching among a plurality of voltage values as an off-control voltage for turning off the transfer transistor.
  • (13) A distance measuring system comprising: a light emitting unit that emits irradiation light; and a light-receiving element that receives reflected light of the irradiation light reflected by an object, wherein the light-receiving element includes: a pixel having a photoelectric conversion unit and a transfer transistor that transfers charge generated by the photoelectric conversion unit; and a voltage control unit that switches among a plurality of voltage values as an off-control voltage for turning off the transfer transistor.
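
The switching conditions in configurations (3) through (10) above amount to a small decision procedure: select the first (higher) off-control voltage when transfer efficiency matters (high modulation frequency, ranging mode, modest light level), and the second (lower) voltage otherwise. The sketch below combines those alternative embodiments into one policy purely for illustration; all names, voltage levels, and thresholds are hypothetical assumptions, since the publication specifies no concrete values.

```python
# Hypothetical sketch of the off-control voltage selection in
# configurations (3)-(10). Voltage levels, thresholds, and the mode/ratio
# parameters are illustrative assumptions, not values from the publication.

FIRST_VOLTAGE = 0.0    # e.g. VSS, kept during high-speed operation
SECOND_VOLTAGE = -0.5  # e.g. VSS1, a lower off-voltage for low-speed operation

def select_off_voltage(modulation_freq_hz, threshold_freq_hz,
                       mode, saturated_pixels, total_pixels,
                       saturation_ratio=0.1):
    """Return the off-control voltage applied to the transfer transistors.

    mode: "ranging" (ToF distance measurement) or "2d" (luminance image).
    """
    # Configuration (7): 2D imaging mode uses the second (lower) voltage.
    if mode == "2d":
        return SECOND_VOLTAGE
    # Configurations (9)/(10): a large received-light amount, judged from
    # the fraction of saturated pixels, selects the second voltage.
    if total_pixels and saturated_pixels / total_pixels >= saturation_ratio:
        return SECOND_VOLTAGE
    # Configuration (5): compare the modulation frequency with a threshold.
    if modulation_freq_hz > threshold_freq_hz:
        return FIRST_VOLTAGE
    return SECOND_VOLTAGE
```

For example, a 100 MHz ranging exposure with few saturated pixels would keep the first voltage, while dropping to 20 MHz against a 50 MHz threshold would select the second, matching the frequency-dependent switching of configuration (5).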

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The present disclosure relates to: a light-receiving element capable of improving distance measurement accuracy during low-speed operation while maintaining transfer efficiency during high-speed operation; a control method therefor; and a distance measuring system. The light-receiving element includes: pixels each having a photoelectric conversion unit and a transfer transistor that transfers charge generated by the photoelectric conversion unit; and a voltage control unit that performs switching control between a voltage VSS and a voltage VSS1 as off-control voltages for turning off the transfer transistors. The present disclosure can be applied, for example, to a light-receiving element that measures the distance to a subject by emitting irradiation light toward the subject and receiving the light reflected by the subject.
PCT/JP2022/004585 2021-06-02 2022-02-07 Élément de réception de lumière, procédé de commande associé et système de mesure de distance WO2022254792A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-092650 2021-06-02
JP2021092650A JP2022185168A (ja) 2021-06-02 2021-06-02 受光素子およびその駆動方法、並びに、測距システム

Publications (1)

Publication Number Publication Date
WO2022254792A1 true WO2022254792A1 (fr) 2022-12-08

Family

ID=84324104

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/004585 WO2022254792A1 (fr) 2021-06-02 2022-02-07 Élément de réception de lumière, procédé de commande associé et système de mesure de distance

Country Status (2)

Country Link
JP (1) JP2022185168A (fr)
WO (1) WO2022254792A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100157120A1 (en) * 2008-12-19 2010-06-24 Compton John T Image sensor with controllable transfer gate off state voltage levels
JP2015201733A (ja) * 2014-04-07 2015-11-12 キヤノン株式会社 撮像装置及びその制御方法
US20180191980A1 (en) * 2016-12-29 2018-07-05 Samsung Electronics Co., Ltd. Image sensors and electronic apparatuses including the same
JP2019164039A (ja) * 2018-03-20 2019-09-26 ソニーセミコンダクタソリューションズ株式会社 距離センサおよび距離測定装置


Also Published As

Publication number Publication date
JP2022185168A (ja) 2022-12-14

Similar Documents

Publication Publication Date Title
US11509840B2 (en) Solid-state imaging device, signal processing chip, and electronic apparatus
US11102433B2 (en) Solid-state imaging device having a photoelectric conversion element with multiple electrodes
US20200057149A1 (en) Optical sensor and electronic device
WO2021085128A1 (fr) Dispositif de mesure de distance, procédé de mesure, et système de mesure de distance
WO2020158401A1 (fr) Dispositif de réception de lumière et système de télémétrie
US11582407B2 (en) Solid-state imaging apparatus and driving method thereof
US11937001B2 (en) Sensor and control method
US20210293958A1 (en) Time measurement device and time measurement apparatus
WO2020166419A1 (fr) Dispositif de réception de lumière, procédé de génération d'histogramme et système de mesure de distance
TWI757751B (zh) 測距感測器、信號處理方法及測距模組
WO2021010174A1 (fr) Dispositif de réception de lumière et procédé de commande de dispositif de réception de lumière
US20240193908A1 (en) Information processing apparatus and information processing method
WO2022254792A1 (fr) Élément de réception de lumière, procédé de commande associé et système de mesure de distance
US20230341520A1 (en) Light receiving device, drive control method therefor, and distance measuring device
WO2021153299A1 (fr) Élément d'imagerie et module de mesure de distance
WO2020166349A1 (fr) Dispositif de réception de lumière, procédé de génération d'histogramme et système de télémétrie
US20230417920A1 (en) Ranging sensor, ranging system, and electronic device
US20230228875A1 (en) Solid-state imaging element, sensing system, and control method of solid-state imaging element
US20230352512A1 (en) Imaging element, imaging device, electronic equipment
WO2024024469A1 (fr) Dispositif de détection de lumière
WO2024090031A1 (fr) Élément d'imagerie et dispositif électronique
WO2020050007A1 (fr) Dispositif d'imagerie à semi-conducteur et dispositif électronique
JP2024067906A (ja) 光検出装置
JP2022007152A (ja) 光検出装置および測距システム
CN116940893A (zh) 摄像装置和摄像系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22815551

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22815551

Country of ref document: EP

Kind code of ref document: A1