WO2021131684A1 - Distance measuring device, method for controlling distance measuring device, and electronic device - Google Patents

Distance measuring device, method for controlling distance measuring device, and electronic device

Info

Publication number
WO2021131684A1
WO2021131684A1 (PCT/JP2020/045758)
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
light
light source
unit
distance
Prior art date
Application number
PCT/JP2020/045758
Other languages
English (en)
Japanese (ja)
Inventor
ジャエシュ ハナーカル
竜生 諸角
陽太郎 安
Original Assignee
Sony Semiconductor Solutions Corporation
Priority date
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corporation
Publication of WO2021131684A1


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 - Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02 - Details
    • G01C 3/06 - Use of electric means to obtain final indication
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 - Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 - Systems determining position data of a target
    • G01S 17/08 - Systems determining position data of a target for measuring distance only
    • G01S 17/10 - Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 - Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 - Systems determining position data of a target
    • G01S 17/08 - Systems determining position data of a target for measuring distance only
    • G01S 17/32 - Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S 7/483 - Details of pulse systems

Definitions

  • The present technology relates to a distance measuring device, a method for controlling the same, and an electronic device, and in particular to a distance measuring device, a control method, and an electronic device capable of measuring distance limited to a desired measurement range.
  • A distance measuring module is mounted on mobile terminals such as smartphones.
  • As a distance measuring method used in such a distance measuring module, there is, for example, a method called the ToF (Time of Flight) method.
  • In the ToF method, light is emitted toward an object, the light reflected on the surface of the object is detected, and the distance to the object is calculated from a measured value of the flight time of the light (see, for example, Patent Document 1).
  • With a ToF type distance measuring device, there is a demand to measure distance limited to a desired measurement range.
  • The present technology has been made in view of such a situation, and makes it possible to measure a distance limited to a desired measurement range.
  • The distance measuring device according to the first aspect of the present technology includes a light emitting source that emits irradiation light, a light receiving sensor that receives the reflected light that is reflected by an object and returned, and a coding unit that generates a coded light source modulation signal and a coded sensor modulation signal by encoding, according to a predetermined code, a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor.
  • The control method of the distance measuring device according to the second aspect of the present technology is a method for a distance measuring device having a light emitting source that emits irradiation light and a light receiving sensor that receives the reflected light that is reflected by an object and returned, in which a coded light source modulation signal and a coded sensor modulation signal are generated by encoding, according to a predetermined code, the light source modulation signal that controls the light emission timing of the light emitting source and the sensor modulation signal that controls the light reception timing of the light receiving sensor, and a delayed light source modulation signal or a delayed sensor modulation signal whose phase is delayed by a predetermined delay amount is generated.
  • The electronic device according to the third aspect of the present technology is provided with a distance measuring device including a light emitting source that emits irradiation light, a light receiving sensor that receives the reflected light that is reflected by an object and returned, a coding unit that generates a coded light source modulation signal and a coded sensor modulation signal by encoding, according to a predetermined code, a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor, and a sensor delay unit that generates a delayed sensor modulation signal whose phase is delayed by a predetermined delay amount.
  • In a distance measuring device having a light emitting source that emits irradiation light and a light receiving sensor that receives the reflected light that is reflected by an object and returned, a coded light source modulation signal and a coded sensor modulation signal are generated by encoding, according to a predetermined code, the light source modulation signal that controls the light emission timing of the light emitting source and the sensor modulation signal that controls the light reception timing of the light receiving sensor, and a delayed light source modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded light source modulation signal, or a delayed sensor modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded sensor modulation signal, is generated.
  • The distance measuring device and the electronic device may each be an independent device or a module incorporated in another device.
  • Shows the relationship between the distance to the object and the signal strength in the normal mode with code period 1.
  • Shows the relationship between the distance to the object and the signal strength in the normal mode with code period 2.
  • Shows the relationship between the distance to the object and the signal strength in the long-distance mode with code period 1.
  • Shows the relationship between the distance to the object and the signal strength in the long-distance mode with code period 2.
  • Shows the relationship between the distance to the object and the signal strength in the short-distance mode with code period 1.
  • Shows the relationship between the distance to the object and the signal strength in the short-distance mode with code period 2.
  • Shows the relationship between the distance to the object and the signal strength in the short-distance mode with code period 1.
  • FIG. 1 is a block diagram showing a configuration example of a distance measuring device according to an embodiment to which the present technology is applied.
  • The distance measuring device 1 shown in FIG. 1 is a distance measuring module that performs distance measurement by the Indirect ToF method. It irradiates a predetermined object (measurement target) as a subject with light, receives the light (reflected light) that the object reflects back, and generates and outputs a depth map and a reliability map as distance information to the object.
  • The distance measuring device 1 includes a timing signal generation unit 11, a phase setting unit 12, a light source modulation unit 13, a sensor modulation unit 14, a code generation unit 15, a coding unit 16, a light source delay unit 17, a sensor delay unit 18, a light emitting source 19, a light receiving sensor 20, and a control unit 21.
  • The timing signal generation unit 11 generates a timing signal that serves as a reference for the light emitting operation of the light emitting source 19 and the light receiving operation of the light receiving sensor 20. Specifically, the timing signal generation unit 11 generates a modulation signal having a predetermined modulation frequency Fmod (for example, 20 MHz) and supplies it to the light source modulation unit 13, the sensor modulation unit 14, and the code generation unit 15.
  • The modulation signal is, for example, a pulse signal that repeats on (High) and off (Low) at the modulation frequency Fmod, as shown in FIG. 4.
  • When performing distance measurement by the Indirect ToF method, the phase setting unit 12 sets the phase difference φD between the light emission timing of the light emitting source 19 and the light reception timing of the light receiving sensor 20, and supplies it to the light source modulation unit 13 or the sensor modulation unit 14.
  • The phase difference φD between the light emission timing and the light reception timing is referred to as the drive phase difference φD, to distinguish it from the phase difference φ detected according to the distance to the subject.
  • When the drive phase difference φD is supplied from the phase setting unit 12, the light source modulation unit 13 generates a light source modulation signal whose phase is shifted by the drive phase difference φD with respect to the modulation signal supplied from the timing signal generation unit 11, and supplies it to the coding unit 16.
  • Likewise, when the drive phase difference φD is supplied from the phase setting unit 12, the sensor modulation unit 14 generates a sensor modulation signal whose phase is shifted by the drive phase difference φD with respect to the modulation signal supplied from the timing signal generation unit 11, and supplies it to the coding unit 16.
  • Since the phases of the light source modulation signal generated by the light source modulation unit 13 and the sensor modulation signal generated by the sensor modulation unit 14 need only differ by the drive phase difference φD, the phase setting unit 12 may supply the drive phase difference φD to either the light source modulation unit 13 or the sensor modulation unit 14. For example, when the phase setting unit 12 supplies the drive phase difference φD to the light source modulation unit 13, the light source modulation unit 13 generates a light source modulation signal whose phase is shifted by the drive phase difference φD with respect to the modulation signal.
  • In that case, the sensor modulation unit 14 supplies the modulation signal from the timing signal generation unit 11 to the coding unit 16 as it is, as the sensor modulation signal.
  • Conversely, when the phase setting unit 12 supplies the drive phase difference φD to the sensor modulation unit 14, the sensor modulation unit 14 generates a sensor modulation signal whose phase is shifted by the drive phase difference φD with respect to the modulation signal, and the light source modulation unit 13 supplies the modulation signal from the timing signal generation unit 11 to the coding unit 16 as it is, as the light source modulation signal.
  • The code generation unit 15 is supplied with the modulation signal, which serves as the reference timing signal, from the timing signal generation unit 11, and with the code period from the control unit 21.
  • The code generation unit 15 randomly generates a code of 0 or 1 in units of the code period supplied from the control unit 21 and supplies the code to the coding unit 16.
  • The unit of the code period is one cycle of the modulation signal, and the code period supplied from the control unit 21 is an integral multiple of one cycle of the modulation signal.
  • The code period is also referred to as the Chip Length.
  • The coding unit 16 generates coded signals from the light source modulation signal supplied from the light source modulation unit 13 and the sensor modulation signal supplied from the sensor modulation unit 14, according to the code supplied from the code generation unit 15. With respect to the supplied light source modulation signal or sensor modulation signal, the generated coded signal has the same phase when the code is 0 and an inverted phase when the code is 1.
  • Specifically, the coding unit 16 generates a coded light source modulation signal corresponding to the code supplied from the code generation unit 15 from the light source modulation signal supplied from the light source modulation unit 13, and supplies it to the light source delay unit 17.
  • Likewise, the coding unit 16 generates a coded sensor modulation signal corresponding to the code supplied from the code generation unit 15 from the sensor modulation signal supplied from the sensor modulation unit 14, and supplies it to the sensor delay unit 18.
  • The light source delay unit 17 generates a delayed light source modulation signal whose phase is delayed by the delay amount ΔD supplied from the control unit 21, with respect to the coded light source modulation signal supplied from the coding unit 16, and supplies it to the light emitting source 19.
  • The sensor delay unit 18 generates a delayed sensor modulation signal whose phase is delayed by the delay amount ΔD supplied from the control unit 21, with respect to the coded sensor modulation signal supplied from the coding unit 16, and supplies it to the light receiving sensor 20.
  • The light emitting source 19 is composed of, for example, an infrared laser diode as the light source and a laser driver, and emits light modulated at the timing corresponding to the delayed light source modulation signal supplied from the light source delay unit 17, irradiating the object with the irradiation light.
  • The light receiving sensor 20, which will be described in detail later with reference to FIG. 2, has a pixel array unit 32 in which a plurality of pixels 31 are two-dimensionally arranged in a matrix, receives the reflected light from the object, and supplies the control unit 21 with pixel signals corresponding to the amount of reflected light received.
  • The control unit 21 controls the operation of the distance measuring device 1 as a whole.
  • For example, the control unit 21 outputs a trigger signal that starts the operation of the timing signal generation unit 11, the phase setting unit 12, the code generation unit 15, and so on, in response to a distance measurement instruction from the host control unit, which is the control unit of the host device in which the distance measuring device 1 is incorporated. The control unit 21 also determines the code period (Chip Length) and supplies it to the code generation unit 15, and determines the delay amount ΔD according to the measurement mode and supplies it to either the light source delay unit 17 or the sensor delay unit 18.
  • In addition, the control unit 21 generates a depth value and a reliability for each pixel based on the pixel signals supplied from the light receiving sensor 20, generates a depth map that stores the depth value as the pixel value of each pixel and a reliability map that stores the reliability as the pixel value of each pixel, and outputs them to the host control unit.
  • The distance measuring device 1 of FIG. 1 is configured as described above.
  • The distance measuring device 1 has a first measurement mode (hereinafter also referred to as the normal mode), a second measurement mode (hereinafter also referred to as the short-distance mode) that focuses on distance measurement at shorter distances than the first measurement mode, and a third measurement mode (hereinafter also referred to as the long-distance mode) that focuses on distance measurement at longer distances than the first measurement mode.
  • FIG. 2 shows a detailed configuration example of the light receiving sensor 20.
  • the light receiving sensor 20 has a pixel array unit 32 in which the pixels 31 are two-dimensionally arranged in a matrix in the row direction and the column direction, and a drive control circuit 33 arranged in a peripheral region of the pixel array unit 32.
  • the pixel 31 generates an electric charge according to the amount of reflected light received, and outputs a pixel signal corresponding to the electric charge.
  • the pixel 31 includes a photodiode 41 and FD (Floating Diffusion) units 42A and 42B as charge storage units for detecting the charges photoelectrically converted by the photodiode 41.
  • the FD section 42A is also referred to as a tap A (first tap)
  • the FD section 42B is also referred to as a tap B (second tap).
  • The pixel 31 includes a plurality of pixel transistors that control charge accumulation in the FD section 42A as the tap A, namely the transfer transistor 43A, the selection transistor 44A, and the reset transistor 45A, and a plurality of pixel transistors that control charge accumulation in the FD section 42B as the tap B, namely the transfer transistor 43B, the selection transistor 44B, and the reset transistor 45B.
  • Before the start of exposure, a reset operation is performed to reset the excess charge.
  • Specifically, the drive control circuit 33 controls the distribution signals GDA and GDB and the reset signals RSA and RSB to High, turning on the transfer transistor 43A and the reset transistor 45A on the tap A side and the transfer transistor 43B and the reset transistor 45B on the tap B side.
  • After that, the transfer transistor 43A and the reset transistor 45A on the tap A side and the transfer transistor 43B and the reset transistor 45B on the tap B side are turned off.
  • Next, the drive control circuit 33 alternately controls the distribution signals GDA and GDB to High, alternately turning on the transfer transistor 43A on the tap A side and the transfer transistor 43B on the tap B side.
  • As a result, the electric charge generated by the photodiode 41 is distributed to the FD section 42A as the tap A or the FD section 42B as the tap B.
  • The operation of distributing the electric charge generated by the photodiode 41 to the tap A or the tap B is periodically repeated for a time corresponding to the light emission period of one frame.
  • The charges transferred via the transfer transistor 43A are sequentially accumulated in the FD section 42A, and the charges transferred via the transfer transistor 43B are sequentially accumulated in the FD section 42B.
  • Then, the drive control circuit 33 controls the selection signals ROA and ROB to High, so that the detection signal A corresponding to the accumulated charge of the FD section 42A, which is the tap A, and the detection signal B corresponding to the accumulated charge of the FD section 42B, which is the tap B, are output as pixel signals. That is, when the selection transistor 44A is turned on according to the selection signal ROA, the detection signal A corresponding to the amount of charge accumulated in the FD section 42A is output from the pixel 31 via the signal line 46A. Similarly, when the selection transistor 44B is turned on according to the selection signal ROB, the detection signal B corresponding to the amount of charge accumulated in the FD section 42B is output from the pixel 31 via the signal line 46B.
  • In this way, the pixel 31 distributes the electric charge generated by the reflected light received by the photodiode 41 to the tap A or the tap B according to the distribution signals GDA and GDB, and outputs the detection signal A and the detection signal B.
  • The depth value d corresponding to the distance from the distance measuring device 1 to the object can be calculated by the following equation (1): d = (c × Δt) / 2.
  • Here, Δt in equation (1) is the time until the irradiation light emitted from the light emitting source 19 is reflected by the object as the subject and enters the light receiving sensor 20, and c is the speed of light.
  • As the irradiation light, pulsed light with a light emission pattern that repeats on/off at high speed at the modulation frequency Fmod, as shown in FIG. 3, is adopted.
  • One cycle T of the light emission pattern is 1/Fmod.
  • The reflected light (light reception pattern) is detected with a phase shift corresponding to the time Δt from the light emitting source 19 to the light receiving sensor 20. When the phase difference between the light emission pattern and the light reception pattern is φ, the time Δt can be calculated by the following equation (2): Δt = φ / (2π × Fmod).
  • Therefore, from equations (1) and (2), the depth value d from the distance measuring device 1 to the object can be calculated by the following equation (3): d = (c × φ) / (4π × Fmod).
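  • As a purely illustrative aid that is not part of the publication, the following Python sketch evaluates equations (1) to (3) for the example modulation frequency of 20 MHz mentioned above; the phase value in the example call is hypothetical.

      import math

      C = 299_792_458.0  # speed of light in m/s

      def depth_from_phase(phi: float, fmod: float = 20e6) -> float:
          """Depth value d from the detected phase difference phi (eq. (3)):
          eq. (2) gives the flight time, eq. (1) halves the round trip."""
          delta_t = phi / (2.0 * math.pi * fmod)   # eq. (2)
          return C * delta_t / 2.0                 # eq. (1)

      # Example: a detected phase of pi/2 at Fmod = 20 MHz is roughly 1.87 m.
      print(depth_from_phase(math.pi / 2))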
  • Each pixel 31 of the pixel array unit 32 formed in the light receiving sensor 20 repeats turning the transfer transistors 43A and 43B on and off at high speed as described above, and accumulates electric charge only during the on period.
  • The light receiving sensor 20 sequentially switches the on/off execution timing of each pixel 31 of the pixel array unit 32, for example in frame units, accumulates electric charge at each execution timing, and outputs a detection signal corresponding to the accumulated charge.
  • There are four execution timings: phase 0 degrees, phase 90 degrees, phase 180 degrees, and phase 270 degrees.
  • The execution timing of phase 0 degrees is the timing at which the on timing (light reception timing) of the tap A or the tap B of each pixel 31 of the pixel array unit 32 is set to the same phase as the emission timing of the irradiation light, that is, the emission pattern.
  • The execution timing of phase 90 degrees is the timing at which the on timing (light reception timing) of the tap A or the tap B of each pixel 31 of the pixel array unit 32 is set to a phase 90 degrees behind the emission timing (emission pattern) of the irradiation light.
  • The execution timing of phase 180 degrees is the timing at which the on timing (light reception timing) of the tap A or the tap B of each pixel 31 of the pixel array unit 32 is set to a phase 180 degrees behind the emission timing (emission pattern) of the irradiation light.
  • The execution timing of phase 270 degrees is the timing at which the on timing (light reception timing) of the tap A or the tap B of each pixel 31 of the pixel array unit 32 is set to a phase 270 degrees behind the emission timing (emission pattern) of the irradiation light.
  • The on timing of the tap A and the on timing of the tap B are in inverted phase with each other.
  • For example, in a certain frame, the tap A of the pixel 31 operates at the execution timing of phase 0 degrees and the tap B at the execution timing of phase 180 degrees; in the next frame, the tap A operates at the execution timing of phase 90 degrees and the tap B at the execution timing of phase 270 degrees.
  • Therefore, the light receiving sensor 20 needs to receive light (capture images) for at least two frames.
  • The method of acquiring detection signals of four phases by receiving light for two frames and calculating the depth value d in this way is called the 2Phase method.
  • Alternatively, the light receiving sensor 20 may use a method called the 4Phase method, in which each of the tap A and the tap B acquires detection signals of all four phases: phase 0 degrees, phase 90 degrees, phase 180 degrees, and phase 270 degrees.
  • In the 4Phase method, light reception (imaging) of four frames is required, but a result in which the characteristic variation between the tap A and the tap B is removed can be obtained.
  • When the light receiving sensor 20 adopts, for example, the 4Phase method, the light reception timing is switched sequentially in the order of phase 0 degrees, phase 90 degrees, phase 180 degrees, and phase 270 degrees in frame units, and the amount of reflected light received (accumulated charge) is detected at each light reception timing.
  • In the figure, the timing at which the reflected light is incident is shaded.
  • In this way, the depth value d from the distance measuring device 1 to the object can be calculated.
  • The reliability conf is a value representing the intensity of the light received by each pixel, is also referred to as the signal intensity conf, and can be calculated by, for example, the following equation (5).
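  • Equation (5) itself is not reproduced in this text, so the following sketch uses the I/Q formulation commonly used in Indirect ToF for the detected phase difference and the reliability conf; it is an assumed, representative form rather than the publication's exact equations.

      import math

      def phase_and_reliability(q0: float, q90: float, q180: float, q270: float):
          """Detected phase difference and reliability conf from the four
          phase samples (0/90/180/270 degrees) of one pixel 31."""
          i = q0 - q180                             # in-phase component
          q = q90 - q270                            # quadrature component
          phi = math.atan2(q, i) % (2.0 * math.pi)  # detected phase difference
          conf = math.hypot(i, q)                   # signal intensity / reliability
          return phi, conf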
  • The drive control circuit 33 of the pixel array unit 32 generates the distribution signals GDA and GDB so that the timing (light reception timing) at which the electric charge generated by the photodiode 41 of each pixel 31 is accumulated in the tap A or the tap B has a phase of 0 degrees, 90 degrees, 180 degrees, or 270 degrees with respect to the emission timing of the irradiation light.
  • Here, the phase difference between the light emission timing and the light reception timing is the drive phase difference φD set by the phase setting unit 12.
  • FIG. 5 is a diagram illustrating processing from the timing signal generation unit 11 to the coding unit 16 of the distance measuring device 1.
  • the timing signal generation unit 11 generates a modulation signal of the modulation frequency Fmod and supplies it to the light source modulation unit 13, the sensor modulation unit 14, and the code generation unit 15.
  • The phase setting unit 12 sets the drive phase difference φD and supplies it to either the light source modulation unit 13 or the sensor modulation unit 14.
  • FIG. 5 shows an example in which the drive phase difference φD set by the phase setting unit 12 is supplied to the light source modulation unit 13.
  • The light source modulation unit 13 generates a light source modulation signal whose phase is shifted by the drive phase difference φD with respect to the modulation signal from the timing signal generation unit 11, and supplies it to the coding unit 16.
  • The sensor modulation unit 14 supplies the modulation signal from the timing signal generation unit 11 as it is to the coding unit 16 as the sensor modulation signal. Therefore, the sensor modulation signal shown in FIG. 5 is the same as the modulation signal generated by the timing signal generation unit 11.
  • The control unit 21 determines the code period (Chip Length) and supplies it to the code generation unit 15.
  • The code generation unit 15 randomly generates a code of 0 or 1 in units of the code period (two cycles in FIG. 5) and supplies the code to the coding unit 16.
  • In the example of FIG. 5, the codes are generated in the order of "0", "1", "0", and "1".
  • The coding unit 16 performs phase shift processing according to the code on the light source modulation signal supplied from the light source modulation unit 13 and the sensor modulation signal supplied from the sensor modulation unit 14, generating a coded light source modulation signal and a coded sensor modulation signal as the coded signals.
  • The phase shift processing according to the code is processing that generates a coded signal with the same phase when the code is 0 and a coded signal with an inverted phase when the code is 1.
  • Accordingly, the coded light source modulation signal and the coded sensor modulation signal when the code is 0 are the same as the light source modulation signal and the sensor modulation signal, and the coded light source modulation signal and the coded sensor modulation signal when the code is 1 are signals whose phases are inverted (shifted by 180 degrees) with respect to the light source modulation signal and the sensor modulation signal.
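  • The phase shift processing of the coding unit 16 can be illustrated with the following sketch, which leaves a chip's waveform unchanged when its code is 0 and inverts it when its code is 1; the sample resolution and the variable names are illustrative choices, not values taken from the publication.

      SAMPLES_PER_CYCLE = 8   # illustrative oversampling of one modulation cycle

      def square_wave(n_cycles: int) -> list[int]:
          """0/1 square wave at the modulation frequency Fmod."""
          one_cycle = [1] * (SAMPLES_PER_CYCLE // 2) + [0] * (SAMPLES_PER_CYCLE // 2)
          return one_cycle * n_cycles

      def encode(signal: list[int], code: list[int], chip_len_cycles: int) -> list[int]:
          """Code 0 keeps the phase, code 1 inverts it (BPSK), chip by chip."""
          chip_samples = chip_len_cycles * SAMPLES_PER_CYCLE
          coded = []
          for chip_index, bit in enumerate(code):
              chunk = signal[chip_index * chip_samples:(chip_index + 1) * chip_samples]
              coded.extend([1 - s for s in chunk] if bit else chunk)
          return coded

      code = [0, 1, 0, 1]                      # the code sequence of FIG. 5
      modulation = square_wave(n_cycles=8)     # 4 chips of 2 cycles each
      coded_sensor_signal = encode(modulation, code, chip_len_cycles=2)
      # The coded light source signal would be produced the same way from the
      # light source modulation signal (the modulation signal shifted by the
      # drive phase difference before coding).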
  • FIG. 6 is a diagram illustrating processing of the light source delay unit 17 and the sensor delay unit 18 of the distance measuring device 1.
  • the control unit 21 determines the delay amount ⁇ D according to the measurement mode and supplies it to either the light source delay unit 17 or the sensor delay unit 18.
  • the light source delay unit 17 generates a delayed light source modulation signal whose phase is delayed by a delay amount ⁇ D with respect to the coded light source modulation signal supplied from the coding unit 16, and supplies the delayed light source modulation signal to the light emitting source 19.
  • the sensor delay unit 18 generates a delay sensor modulation signal whose phase is delayed by a delay amount ⁇ D with respect to the coded sensor modulation signal supplied from the coding unit 16, and supplies the delay sensor modulation signal to the light receiving sensor 20.
  • FIG. 6 also shows the delayed light source modulation signal and the delayed sensor modulation signal in each case.
  • When the delay amount ΔD is supplied to the sensor delay unit 18, the delayed light source modulation signal is the same signal as the coded light source modulation signal in the upper stage, and the delayed sensor modulation signal is a signal whose phase is delayed by the delay amount ΔD with respect to the coded sensor modulation signal in the upper stage.
  • On the other hand, when the delay amount ΔD is supplied to the light source delay unit 17, the delayed light source modulation signal is a signal whose phase is delayed by the delay amount ΔD with respect to the coded light source modulation signal in the upper stage, and the delayed sensor modulation signal is the same signal as the coded sensor modulation signal in the upper stage.
  • The light emitting source 19 emits light modulated at the timing corresponding to the delayed light source modulation signal supplied from the light source delay unit 17, and irradiates the object with the irradiation light.
  • The light receiving sensor 20 receives the reflected light at each pixel 31 at the timing corresponding to the delayed sensor modulation signal supplied from the sensor delay unit 18, and outputs a pixel signal corresponding to the amount of reflected light received.
  • Specifically, the distribution signals GDA and GDB that alternately turn on the tap A and the tap B are generated based on the delayed sensor modulation signal supplied from the sensor delay unit 18: the delayed sensor modulation signal becomes the distribution signal GDA, and a signal obtained by inverting the phase of the delayed sensor modulation signal becomes the distribution signal GDB.
  • As described above, the distance measuring device 1 encodes the light source modulation signal and the sensor modulation signal in the coding unit 16 in units of the code period, according to the code generated by the code generation unit 15.
  • The resulting coded light source modulation signal and coded sensor modulation signal are BPSK (Binary Phase Shift Keying) signals whose phases are shifted to 0 degrees or 180 degrees according to the binary code of "0" or "1".
  • Then, the light source delay unit 17 or the sensor delay unit 18 generates a delayed light source modulation signal or a delayed sensor modulation signal in which either the coded light source modulation signal or the coded sensor modulation signal is phase-delayed by the delay amount ΔD.
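  • A similarly simplified sketch of the delay stage, and of how the distribution signals GDA and GDB could be derived from the delayed sensor modulation signal, is given below; the sample-based delay and the toy waveform are illustrative assumptions.

      def delay(signal: list[int], delay_samples: int) -> list[int]:
          """Phase-delay a sampled 0/1 signal by delay_samples samples
          (a sample-domain stand-in for the delay amount Delta_D)."""
          return [0] * delay_samples + signal[:len(signal) - delay_samples]

      # A toy coded sensor modulation signal: two cycles, 8 samples per cycle.
      coded_sensor_signal = [1, 1, 1, 1, 0, 0, 0, 0] * 2

      # Long-distance mode: the sensor side is delayed, the light source side is not.
      gda = delay(coded_sensor_signal, delay_samples=4)   # distribution signal GDA
      gdb = [1 - s for s in gda]                          # GDB: GDA with inverted phase

      # In the short-distance mode, the same delay would instead be applied to
      # the coded light source modulation signal, leaving the sensor side as is.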
  • Based on these signals, the light emitting source 19 emits light and the light receiving sensor 20 receives light.
  • Here, the signal intensity of the reflected light is the reliability conf calculated by equation (5) based on the detection signals of the pixel 31.
  • The timing chart shown in the upper left part of FIG. 7 shows the waveform of the irradiation light, the waveform of the reflected light reflected by the object when the object to be measured is ideally located at a distance of zero, and the light reception timings of the tap A and the tap B that receive the light. Since the object to be measured is assumed to be ideally located at a distance of zero, the waveform of the irradiation light is also the waveform of the reflected light.
  • The timing at which the reflected light is incident is marked with a pattern.
  • The timing chart shown in the lower left part of FIG. 7 shows the waveform of the irradiation light and the light reception timings of the tap A and the tap B that receive the reflected light reflected by the object when the object is farther away.
  • The waveform of the irradiation light is the same as the waveform of the irradiation light in the upper stage.
  • Here too, the timing at which the reflected light is incident is marked with a pattern.
  • The signal intensity Conf detected by each pixel 31 of the light receiving sensor 20 when the object shown in the upper left is at a distance of zero is the signal intensity C2 shown in the graph on the right side of FIG. 7.
  • The signal intensity C2 corresponds to the case where the drive phase difference φD is other than 0; when the drive phase difference φD is 0, the signal intensity Conf is the maximum signal intensity C1.
  • When the object is farther away, the reflected light reaches the light receiving sensor 20 with a delay of 2Pi, as shown in the lower left.
  • In that case, the light reception timing for that cycle is lost, and the signal intensity Conf becomes the signal intensity C3 shown in the graph on the right side of FIG. 7.
  • As shown in the graph on the right side of FIG. 7, the signal intensity Conf is attenuated according to the distance and becomes zero at a predetermined distance.
  • The distance of the zero point at which the signal intensity Conf becomes zero depends on the code period (Chip Length).
  • In the example of FIG. 7, the distance at the zero point is a distance equivalent to 4Pi.
  • In general, the distance at the zero point at which the signal intensity Conf becomes zero is a distance equivalent to (Chip Length x 2Pi).
  • In this way, the signal intensity Conf beyond a predetermined distance can be set to zero, so that signals beyond that distance can be cut off. Further, by controlling the code period (Chip Length), the distance at which the signal is cut off can be set to an arbitrary distance. According to the distance measuring device 1, it is therefore possible to measure a distance limited to a desired measurement range, although there is a trade-off with the SN ratio of the effective ranging range.
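  • The cut-off behavior described above can be reproduced with a toy numerical simulation such as the following, in which a randomly coded square carrier is delayed, sampled at the four phases, and the resulting signal intensity is printed for several delays; the waveform model, the sample counts, and the averaging are illustrative assumptions rather than the publication's implementation, but the printed signal intensity falls to roughly zero once the delay reaches the code period, that is, a distance equivalent to (Chip Length x 2Pi).

      import math
      import random

      SAMPLES = 64        # samples per modulation cycle (illustrative)
      CHIP_LEN = 2        # code period (Chip Length) in modulation cycles
      N_CHIPS = 400       # number of chips simulated

      random.seed(0)
      CODE = [random.randint(0, 1) for _ in range(N_CHIPS)]

      def coded_wave() -> list[int]:
          """+1/-1 square carrier, phase-inverted for chips whose code is 1."""
          cycle = [1] * (SAMPLES // 2) + [-1] * (SAMPLES // 2)
          chip = cycle * CHIP_LEN
          wave = []
          for bit in CODE:
              wave.extend([-s for s in chip] if bit else chip)
          return wave

      def shift(signal: list[int], n: int) -> list[int]:
          """Delay a signal by n samples (zero-padded at the front)."""
          return [0] * n + signal[:len(signal) - n]

      def conf_at_delay(delay_samples: int) -> float:
          """Signal intensity obtained by sampling the delayed, coded reflected
          light with the coded sensor signal at phases 0/90/180/270 degrees."""
          emitted = coded_wave()
          received = shift(emitted, delay_samples)       # reflected light
          taps = []
          for phase_deg in (0, 90, 180, 270):
              demod = shift(coded_wave(), phase_deg * SAMPLES // 360)
              taps.append(sum(r for r, d in zip(received, demod) if d > 0))
          i, q = taps[0] - taps[2], taps[1] - taps[3]
          return math.hypot(i, q) / len(emitted)

      # Conf decays with distance and is ~0 from CHIP_LEN cycles of delay onward.
      for cycles in (0.0, 0.5, 1.0, 1.5, 2.0, 2.5):
          print(cycles, round(conf_at_delay(int(cycles * SAMPLES)), 3))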
  • Operation in the short-distance mode: Next, the short-distance mode (second measurement mode) will be described with reference to FIG. 9.
  • The timing chart shown in the upper left part of FIG. 9 shows the waveform of the irradiation light in the normal mode shown in FIG. 7 and the light reception timings of the tap A and the tap B.
  • In the normal mode, the delay amount ΔD is 0.
  • The timing chart shown in the lower left part of FIG. 9 shows the waveform of the irradiation light in the short-distance mode and the light reception timings of the tap A and the tap B.
  • In the short-distance mode, the control unit 21 sets the delay amount ΔD to a predetermined value and supplies it to the light source delay unit 17.
  • The light source delay unit 17 generates a delayed light source modulation signal whose phase is delayed by the delay amount ΔD with respect to the coded light source modulation signal supplied from the coding unit 16.
  • As a result, the relationship between the distance to the object and the signal intensity Conf is shifted to the left, which is the short distance side, compared with the normal mode, and the distance at the zero point becomes shorter than in the normal mode by an amount equivalent to the delay amount ΔD.
  • In this way, by setting the delay amount ΔD to a predetermined value and delaying the emission of the irradiation light, the distance measuring device 1 can perform measurement with the distance measurement range limited further to the short distance side than the distance equivalent to (Chip Length x 2Pi) in the normal mode.
  • The short distance to be measured can be set arbitrarily by controlling the delay amount ΔD. However, there is a trade-off with the SN ratio of the effective ranging range.
  • Next, the long-distance mode (third measurement mode) will be described with reference to FIG. 10. The timing chart shown in the upper left part of FIG. 10 shows the waveform of the irradiation light in the normal mode shown in FIG. 7 and the light reception timings of the tap A and the tap B.
  • In the normal mode, the delay amount ΔD is 0.
  • The timing chart shown in the lower left part of FIG. 10 shows the waveform of the irradiation light in the long-distance mode and the light reception timings of the tap A and the tap B.
  • In the long-distance mode, the control unit 21 sets the delay amount ΔD to a predetermined value and supplies it to the sensor delay unit 18.
  • The sensor delay unit 18 generates a delayed sensor modulation signal whose phase is delayed by the delay amount ΔD with respect to the coded sensor modulation signal supplied from the coding unit 16.
  • In the example of FIG. 10, the delay amount ΔD is set to 2Pi.
  • The relationship between the distance to the object and the signal intensity Conf is shown in the graph on the right side of FIG. 10. The relationship is shifted to the right, which is the long distance side, compared with the normal mode, and the signal intensity Conf becomes the maximum (signal intensity C1) at a distance equivalent to 2Pi.
  • In this way, by setting the delay amount ΔD to a predetermined value and delaying the light reception of the light receiving sensor 20, the distance measuring device 1 can perform measurement with the distance measurement range limited further to the long distance side than the distance equivalent to (Chip Length x 2Pi) in the normal mode. In that case, the distance measurement performance on the short distance side is reduced, and distance measurement can be performed so that the performance is maximized at a desired distance (2Pi in FIG. 10).
  • The distance to be measured can be set arbitrarily by controlling the delay amount ΔD. Since the long-distance mode can attenuate short-distance signals, for example, the influence of scattered light generated between the lens and the sensor can be reduced, and the signal amount of a distant subject can be relatively increased.
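  • Under the envelope behavior suggested by the simulation results described below, the peak position and the zero point in each measurement mode can be summarized by the small helper below; the closed-form expressions are an interpretation of those results, not formulas quoted from the publication.

      import math

      def peak_and_zero_point(chip_length: int, delay: float, mode: str) -> tuple[float, float]:
          """Phase-equivalent distance of the Conf peak and of the zero point,
          as read off the simulation results: the sensor-side delay of the
          long-distance mode shifts both toward longer distances, while the
          light-source-side delay of the short-distance mode pulls the zero
          point toward shorter distances."""
          zero_normal = chip_length * 2.0 * math.pi
          if mode == "long":            # delay applied to the sensor side
              return delay, zero_normal + delay
          if mode == "short":           # delay applied to the light source side
              return 0.0, zero_normal - delay
          return 0.0, zero_normal       # normal mode, delay = 0

      # Examples matching the simulations described below:
      print(peak_and_zero_point(1, math.pi, "long"))           # peak Pi, zero 3Pi
      print(peak_and_zero_point(1, math.pi * 7 / 4, "short"))  # zero Pi/4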
  • Next, the distance measuring process of the distance measuring device 1 will be described with reference to the flowchart of FIG. 11. In step S1, the control unit 21 determines the code period (Chip Length) and the delay amount ΔD according to the measurement mode.
  • The determined code period is supplied to the code generation unit 15.
  • The determined delay amount ΔD is supplied to the light source delay unit 17 when the measurement mode is the short-distance mode, and to the sensor delay unit 18 when the measurement mode is the long-distance mode.
  • In the normal mode, the delay amount ΔD is 0, so the delay amount ΔD is supplied to neither the light source delay unit 17 nor the sensor delay unit 18.
  • In step S2, the timing signal generation unit 11 generates a modulation signal of the modulation frequency Fmod and supplies it to the light source modulation unit 13, the sensor modulation unit 14, and the code generation unit 15.
  • In step S3, the phase setting unit 12 sets the drive phase difference φD and supplies it to either the light source modulation unit 13 or the sensor modulation unit 14.
  • Here, it is assumed that the drive phase difference φD is supplied to the light source modulation unit 13.
  • In step S3 of the first frame of the 4Phase method, for example, 0 is set as the drive phase difference φD.
  • In step S4, the light source modulation unit 13 and the sensor modulation unit 14 generate modulation signals corresponding to the drive phase difference φD from the phase setting unit 12. Specifically, the light source modulation unit 13 generates a light source modulation signal whose phase is shifted by the drive phase difference φD with respect to the modulation signal from the timing signal generation unit 11, and supplies it to the coding unit 16. The sensor modulation unit 14 supplies the modulation signal from the timing signal generation unit 11 as it is to the coding unit 16 as the sensor modulation signal.
  • In step S5, the code generation unit 15 randomly generates a code of 0 or 1 in units of the code period set by the control unit 21 and supplies the code to the coding unit 16.
  • For example, when the code period is two cycles, the code generation unit 15 randomly generates a code of 0 or 1 in units of two cycles of the modulation frequency Fmod and supplies it to the coding unit 16.
  • Steps S4 and S5 may be executed in the reverse order, or may be executed in parallel.
  • In step S6, the coding unit 16 performs phase shift processing according to the code on the light source modulation signal supplied from the light source modulation unit 13 and the sensor modulation signal supplied from the sensor modulation unit 14, generating a coded light source modulation signal and a coded sensor modulation signal as the coded signals.
  • The generated coded light source modulation signal is supplied to the light source delay unit 17, and the generated coded sensor modulation signal is supplied to the sensor delay unit 18.
  • In step S7, the light source delay unit 17 and the sensor delay unit 18 generate a delayed light source modulation signal and a delayed sensor modulation signal whose phases are delayed by the delay amount ΔD supplied from the control unit 21.
  • Specifically, when the delay amount ΔD is supplied to the light source delay unit 17, the light source delay unit 17 generates a delayed light source modulation signal whose phase is delayed by the delay amount ΔD with respect to the coded light source modulation signal from the coding unit 16, and supplies it to the light emitting source 19.
  • When the delay amount ΔD is supplied to the sensor delay unit 18, the sensor delay unit 18 generates a delayed sensor modulation signal whose phase is delayed by the delay amount ΔD with respect to the coded sensor modulation signal from the coding unit 16, and supplies it to the light receiving sensor 20. If the delay amount ΔD is not supplied, the input modulation signal is output as it is.
  • In step S8, the distance measuring device 1 emits the irradiation light and receives the reflected light.
  • That is, the light emitting source 19 emits light modulated at the timing corresponding to the delayed light source modulation signal supplied from the light source delay unit 17, and irradiates the object with the irradiation light.
  • Each pixel 31 of the light receiving sensor 20 receives the reflected light at the timing corresponding to the delayed sensor modulation signal supplied from the sensor delay unit 18, and outputs a pixel signal corresponding to the amount of reflected light received to the control unit 21.
  • In step S9, the control unit 21 of the distance measuring device 1 determines whether the phase data of all frames has been acquired. Specifically, in the case of the 2Phase method, the control unit 21 determines whether light reception for two frames has been performed, and if so, determines that the phase data of all frames has been acquired. Similarly, in the case of the 4Phase method, the control unit 21 determines whether light reception for four frames has been performed, and if so, determines that the phase data of all frames has been acquired.
  • If it is determined in step S9 that the phase data of all frames has not been acquired, the process returns to step S3, and the processes of steps S3 to S8 described above are repeated. In the next step S3, in the case of the 4Phase method, for example, the drive phase difference φD is set to 90 degrees.
  • If it is determined in step S9 that the phase data of all frames has been acquired, the process proceeds to step S10, and the control unit 21 generates and outputs a depth map and a reliability map. More specifically, the control unit 21 calculates the depth value d for each pixel 31 of the pixel array unit 32 from the acquired phase data (detection signals) of all frames by equation (3), and also calculates the reliability conf by equation (5). Then, the control unit 21 generates and outputs a depth map in which the depth value is stored as the pixel value of each pixel 31 and a reliability map in which the reliability conf is stored as the pixel value of each pixel 31.
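  • The control flow of steps S1 to S10 can be summarized for the 4Phase method as in the following sketch; the measure_frame call is a hypothetical stub that returns dummy data, and the depth and reliability formulas are the representative forms used earlier, not the publication's verbatim equations (3) and (5).

      import math
      import random

      def measure_frame(drive_phase_deg: int, chip_length: int, delay: float) -> list[float]:
          """Stub for steps S2 to S8: one frame of coded, delayed emission and
          reception. Returns per-pixel accumulated tap values (dummy data,
          toy resolution)."""
          return [random.random() for _ in range(64 * 48)]

      def run_ranging(measurement_mode: str = "normal"):
          # Step S1: Chip Length and delay amount chosen from the measurement mode.
          chip_length = 2
          delay = {"normal": 0.0, "short": 0.5 * math.pi, "long": 2.0 * math.pi}[measurement_mode]

          # Steps S2 to S9: acquire the four phase frames (0/90/180/270 degrees).
          frames = {p: measure_frame(p, chip_length, delay) for p in (0, 90, 180, 270)}

          # Step S10: per-pixel depth value and reliability conf.
          fmod, c = 20e6, 299_792_458.0
          depth_map, reliability_map = [], []
          for q0, q90, q180, q270 in zip(frames[0], frames[90], frames[180], frames[270]):
              i, q = q0 - q180, q90 - q270
              phi = math.atan2(q, i) % (2.0 * math.pi)
              depth_map.append(c * phi / (4.0 * math.pi * fmod))
              reliability_map.append(math.hypot(i, q))
          return depth_map, reliability_map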
  • Next, simulation results are shown for cases in which the measurement mode is set to the normal mode, the short-distance mode, or the long-distance mode, and the code period (Chip Length) and the delay amount ΔD are set to predetermined values.
  • The upper part of FIG. 12 shows the coded light source modulation signal when the drive phase difference φD is set to 0 degrees, 90 degrees, 180 degrees, and 270 degrees, respectively.
  • The lower part of FIG. 12 shows the result of simulating the relationship between the distance to the object and the signal intensity Conf with the settings shown in the upper part.
  • In FIG. 13, the types of signals shown in the upper row and the graph shown in the lower row are the same as in FIG. 12, so a detailed description is omitted, and only the relationship between the distance to the object and the signal intensity Conf is described. The same applies to FIGS. 14 to 18 described later.
  • By controlling the code period (Chip Length), the distance at the zero point, in other words the distance at which the signal is cut off, can be set to an arbitrary distance.
  • In the long-distance mode simulations, the delay amount ΔD is supplied to the sensor delay unit 18, and a delayed sensor modulation signal whose phase is delayed by the delay amount ΔD with respect to the coded sensor modulation signal is generated.
  • In the first example, the delay amount ΔD is 1Pi.
  • The distance at which the signal intensity Conf peaks changes from 0 to Pi, shifted in the long distance direction.
  • The distance of the zero point also changes from 2Pi to 3Pi, shifted in the long distance direction.
  • In the next example, the delay amount ΔD supplied to the sensor delay unit 18 is 2Pi.
  • The distance at which the signal intensity Conf peaks changes from 0 to 2Pi, shifted in the long distance direction.
  • The distance of the zero point also changes from 4Pi to 6Pi, shifted in the long distance direction.
  • From the above, it can be seen that in the long-distance mode, the peak and the zero point of the signal intensity Conf can be set farther than in the normal mode, and can be set to arbitrary distances.
  • In the short-distance mode simulations, the delay amount ΔD is supplied to the light source delay unit 17, and a delayed light source modulation signal whose phase is delayed by the delay amount ΔD with respect to the coded light source modulation signal is generated.
  • In the first example, the delay amount ΔD is 1Pi.
  • The peak value of the signal intensity Conf (the signal intensity Conf at a distance of zero) is half the value in the normal mode.
  • The distance of the zero point is Pi, which is 1/2 of the 2Pi in the normal mode, shifted in the short distance direction.
  • In the next example, the delay amount ΔD supplied to the light source delay unit 17 is again 1Pi.
  • The peak value of the signal intensity Conf (the signal intensity Conf at a distance of zero) is 3/4 of the value in the normal mode.
  • The distance of the zero point is 3Pi, which is 3/4 of the 4Pi in the normal mode, shifted in the short distance direction.
  • In the last example, the delay amount ΔD supplied to the light source delay unit 17 is Pi x 7/4.
  • The peak value of the signal intensity Conf (the signal intensity Conf at a distance of zero) is 0.16 times the value in the normal mode.
  • The distance of the zero point is Pi/4, which is 1/8 of the 2Pi in the normal mode, shifted in the short distance direction.
  • From the above, it can be seen that in the short-distance mode, the peak and the zero point of the signal intensity Conf can be set closer than in the normal mode, and can be set to arbitrary distances.
  • FIG. 19 is a cross-sectional view of a smartphone 101 as an electronic device in which the distance measuring device 1 is incorporated, as viewed from a surface parallel to the display surface.
  • the distance measuring device 1 is incorporated in the smartphone 101.
  • a cover glass 102 is arranged on the front surface of the display panel (not shown) of the smartphone 101, and the distance measuring device 1 is arranged on the back side (inside the main body) of the display panel.
  • the irradiation light L1 emitted from the light emitting source 19 of the distance measuring device 1 passes through the cover glass 102 and is irradiated to the subject 103.
  • the subject 103 is, for example, a user using the smartphone 101.
  • the irradiation light L1 is reflected by the subject 103, passes through the cover glass 102 as the reflected light L2, and is incident on the light receiving sensor 20 via the lens 104.
  • Foreign matter 121 such as dust or fingerprints may adhere to the surface of the cover glass 102. Since the user does not know where the distance measuring device 1 is located inside the smartphone 101, the user often does not notice the influence of the foreign matter 121 on the distance measuring device 1.
  • In that case, the irradiation light L1 is reflected by the foreign matter 121 and, like the reflected light L3, is refracted and enters the light receiving sensor 20.
  • In such a case, measurement excluding the subject 103 can be performed by setting the measurement mode to the short-distance mode.
  • FIG. 20 is a flowchart of the distance measuring process of the distance measuring device 1, which measures the distance to the original object to be measured while detecting the foreign matter described with reference to FIG. 19. This process is started, for example, when a distance measurement instruction is supplied from the control unit (AP) of the smartphone 101 in which the distance measuring device 1 is incorporated.
  • In step S21, the control unit 21 sets the measurement mode to the short-distance mode, and in step S22, the distance measuring device 1 executes measurement in the short-distance mode.
  • The code period (Chip Length) and the delay amount ΔD in the short-distance mode are set to values optimal for measuring the distance corresponding to the case where the irradiation light L1 is reflected on the surface of the cover glass 102.
  • Specifically, the distance measuring device 1 executes the distance measuring process shown in FIG. 11, and the control unit 21 generates a depth map and a reliability map.
  • In step S23, the control unit 21 determines whether foreign matter has been detected based on the depth map and the reliability map acquired in the short-distance mode. For example, when the distance to the surface of the cover glass 102 is detected in the depth map, the control unit 21 determines that foreign matter has been detected.
  • If it is determined in step S23 that foreign matter has been detected, the process proceeds to step S24, and the control unit 21 notifies the control unit of the smartphone 101 of the detection of the foreign matter.
  • The control unit of the smartphone 101, notified by the distance measuring device 1 that foreign matter has been detected, displays on the display, for example, an alert screen requesting the user to remove the foreign matter, as shown in FIG. 21.
  • On the display 141 of the smartphone 101, a message 142 reading "There is dust or fingerprints on the front of the camera. Wipe the area of the red line to clean it." and a red area 143 indicating the area to be wiped are displayed.
  • The red area 143 corresponds to the position of the distance measuring device 1 inside the smartphone 101.
  • The foreign matter 121 is removed by the user wiping the vicinity of the red area 143 in accordance with the message 142.
  • When the user finishes the wiping work, the user operates (presses) the wiping end button 144 as an operation in response to the message 142.
  • When the wiping end button 144 is operated by the user, the distance measurement instruction is supplied again from the control unit of the smartphone 101 to the distance measuring device 1.
  • After notifying the control unit of the smartphone 101 of the detection of foreign matter in step S24, the control unit 21 determines in step S25 whether the distance measurement instruction corresponding to the operation of the wiping end button 144 has been notified from the control unit of the smartphone 101, and waits until it is determined that the distance measurement instruction has been notified.
  • When it is determined in step S25 that the distance measurement instruction has been notified, the process returns to step S22, and measurement in the short-distance mode is executed again.
  • If it is determined in step S23 that no foreign matter has been detected, the process proceeds to step S26, where the control unit 21 sets the measurement mode to the normal mode, and in step S27 the distance measuring device 1 performs measurement in the normal mode. Then, in step S28, the control unit 21 outputs the measurement results of the normal mode. That is, the control unit 21 generates a depth map in which the depth value obtained as a result of the measurement in the normal mode is stored as the pixel value of each pixel 31 and a reliability map in which the reliability conf obtained as a result of the measurement in the normal mode is stored as the pixel value of each pixel 31, outputs them to the control unit of the smartphone 101, and the process ends.
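  • The branching of steps S21 to S28 can be sketched as follows; the measurement and host-notification calls are hypothetical placeholders with dummy return values, and the threshold used to decide that the depth map contains the cover glass surface is an assumption.

      COVER_GLASS_DISTANCE_M = 0.003   # assumed distance to the cover glass surface

      def measure(mode: str) -> tuple[list[float], list[float]]:
          """Placeholder for the ranging process of FIG. 11 in the given
          measurement mode; returns (depth_map, reliability_map). Dummy data."""
          return [0.5], [1.0]

      def notify_host_and_wait_for_retry() -> None:
          """Placeholder: report the foreign matter to the host (steps S24/S25),
          which shows the cleaning alert of FIG. 21 and re-issues the ranging
          instruction when the wiping end button is operated."""
          pass

      def range_with_foreign_matter_check() -> tuple[list[float], list[float]]:
          while True:
              depth_map, _ = measure("short")                        # S21, S22
              foreign_matter = any(                                  # S23: a depth close
                  abs(d - COVER_GLASS_DISTANCE_M) < 0.001            # to the glass surface
                  for d in depth_map)
              if not foreign_matter:
                  break
              notify_host_and_wait_for_retry()                       # S24, S25
          return measure("normal")                                   # S26 to S28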
  • As described above, the distance measuring device 1 can perform distance measurement limited to a desired distance range by setting the measurement mode to the normal mode, the short-distance mode, or the long-distance mode according to the detection target.
  • The distance measuring device 1 described above can be mounted on electronic devices such as smartphones, tablet terminals, mobile phones, personal computers, game machines, television receivers, wearable terminals, digital still cameras, and digital video cameras.
  • FIG. 22 is a block diagram showing a configuration example of a smartphone as an electronic device equipped with a ranging module.
  • The smartphone 201 is configured by connecting a distance measuring module 202, an image pickup device 203, a display 204, a speaker 205, a microphone 206, a communication module 207, a sensor unit 208, a touch panel 209, and a control unit 210 via a bus 211. The control unit 210 functions as an application processing unit 221 and an operation system processing unit 222 by the CPU executing a program.
  • the distance measuring device 1 of FIG. 1 is applied to the distance measuring module 202.
  • The distance measuring module 202 is arranged on the front of the smartphone 201 and, by performing distance measurement for the user of the smartphone 201, can output the depth values of the surface shape of the user's face, hand, fingers, and the like as the distance measurement result.
  • The image pickup device 203 is arranged on the front of the smartphone 201 and acquires an image of the user by imaging the user of the smartphone 201 as a subject. Although not shown, the image pickup device 203 may also be arranged on the back of the smartphone 201.
  • the display 204 displays an operation screen for performing processing by the application processing unit 221 and the operation system processing unit 222, an image captured by the image pickup device 203, and the like.
  • The speaker 205 and the microphone 206, for example, output the voice of the other party and pick up the voice of the user when a call is made with the smartphone 201.
  • the communication module 207 communicates via the communication network.
  • the sensor unit 208 senses speed, acceleration, proximity, etc., and the touch panel 209 acquires a touch operation by the user on the operation screen displayed on the display 204.
  • the application processing unit 221 performs processing for providing various services by the smartphone 201.
  • For example, based on the depth values supplied from the distance measuring module 202, the application processing unit 221 can perform a process of creating a face by computer graphics that virtually reproduces the user's facial expression and displaying it on the display 204. Further, the application processing unit 221 can perform a process of creating, for example, three-dimensional shape data of an arbitrary three-dimensional object based on the depth values supplied from the distance measuring module 202.
  • the operation system processing unit 222 performs processing for realizing the basic functions and operations of the smartphone 201.
  • the operation system processing unit 222 can perform a process of authenticating the user's face and unlocking the smartphone 201 based on the depth value supplied from the distance measuring module 202.
  • Further, the operation system processing unit 222 can perform, for example, a process of recognizing the user's gesture based on the depth values supplied from the distance measuring module 202 and a process of inputting various operations according to the gesture.
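  • The following is a minimal sketch of how such gesture input could be derived from successive depth maps, assuming one depth map per frame and using the horizontal motion of the nearest point as a stand-in for hand tracking; the function names and the swipe threshold are hypothetical and do not reflect the actual processing of the operation system processing unit 222.

```python
# Illustrative-only gesture detection from depth maps (assumed per-frame input).
from typing import List, Optional

import numpy as np


def nearest_point_column(depth_map: np.ndarray) -> int:
    """Column index of the closest pixel, used as a crude hand position."""
    return int(np.unravel_index(np.argmin(depth_map), depth_map.shape)[1])


def detect_swipe(frames: List[np.ndarray], min_columns: int = 5) -> Optional[str]:
    """Return 'left' or 'right' if the closest point moved far enough sideways."""
    cols = [nearest_point_column(f) for f in frames]
    shift = cols[-1] - cols[0]
    if shift >= min_columns:
        return "right"
    if shift <= -min_columns:
        return "left"
    return None


# Example: a hand-like blob moving left to right across 8x16 depth frames.
frames = []
for t in range(6):
    frame = np.full((8, 16), 2000.0)            # background at about 2 m
    frame[3:5, 2 + 2 * t: 4 + 2 * t] = 300.0    # "hand" at about 0.3 m
    frames.append(frame)
print(detect_swipe(frames))  # -> "right"
```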
  • In this way, the measurement mode can be switched to the normal mode, the short-distance mode, or the long-distance mode according to the purpose of the application, so that distance measurement can be performed within a desired distance range.
  • the technology according to the present disclosure can be applied to various products.
  • For example, the technology according to the present disclosure may be realized as a device mounted on any kind of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 23 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via the communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside information detection unit 12030, an in-vehicle information detection unit 12040, and an integrated control unit 12050.
  • In FIG. 23, a microcomputer 12051, an audio/image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated as the functional configuration of the integrated control unit 12050.
  • the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • For example, the drive system control unit 12010 functions as a control device for a driving force generator for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, and a braking device for generating the braking force of the vehicle.
  • the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
  • For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, blinkers, or fog lamps.
  • In this case, radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
  • The body system control unit 12020 accepts the input of these radio waves or signals and controls the door lock device, the power window device, the lamps, and the like of the vehicle.
  • the vehicle outside information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000.
  • the image pickup unit 12031 is connected to the vehicle exterior information detection unit 12030.
  • the vehicle outside information detection unit 12030 causes the image pickup unit 12031 to capture an image of the outside of the vehicle and receives the captured image.
  • Based on the received image, the vehicle exterior information detection unit 12030 may perform detection processing for objects such as a person, a vehicle, an obstacle, a sign, or characters on a road surface, or distance detection processing.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electric signal according to the amount of the light received.
  • the image pickup unit 12031 can output an electric signal as an image or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
  • the in-vehicle information detection unit 12040 detects the in-vehicle information.
  • a driver state detection unit 12041 that detects the driver's state is connected to the in-vehicle information detection unit 12040.
  • The driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or the degree of concentration of the driver, or may determine whether the driver is dozing off.
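  • Purely as an illustrative sketch (the disclosure does not specify how the degree of fatigue is calculated), a common heuristic is to track the fraction of recent frames in which the driver's eyes are closed; the window length, the 0.4 threshold, and the boolean eye-state input below are assumptions.

```python
# Illustrative drowsiness heuristic: ratio of eyes-closed frames in a window.
from collections import deque


class DrowsinessEstimator:
    def __init__(self, window: int = 90, threshold: float = 0.4):
        self.samples = deque(maxlen=window)   # roughly 3 s at 30 fps (assumed)
        self.threshold = threshold

    def update(self, eyes_closed: bool) -> bool:
        """Feed one frame's eye state; return True if the driver may be dozing."""
        self.samples.append(eyes_closed)
        closed_ratio = sum(self.samples) / len(self.samples)
        return closed_ratio > self.threshold


estimator = DrowsinessEstimator()
dozing = False
for frame in range(120):
    dozing = estimator.update(eyes_closed=(frame % 3 != 0))  # eyes closed 2/3 of the time
print(dozing)  # -> True
```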
  • The microcomputer 12051 can calculate a control target value of the driving force generator, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, and can output a control command to the drive system control unit 12010.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including vehicle collision avoidance or impact mitigation, follow-up driving based on the inter-vehicle distance, constant-speed driving, vehicle collision warning, and vehicle lane departure warning.
  • Further, the microcomputer 12051 can perform cooperative control for the purpose of automatic driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generator, the steering mechanism, the braking device, and the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040.
  • the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of preventing glare, such as switching from the high beam to the low beam, by controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030.
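  • A hedged sketch of such anti-glare control is shown below; the DetectedVehicle structure and the dipping distance are illustrative assumptions rather than the behavior of the body system control unit 12020.

```python
# Illustrative beam selection: dip to low beam when another vehicle is ahead.
from dataclasses import dataclass
from typing import List


@dataclass
class DetectedVehicle:
    kind: str          # "preceding" or "oncoming"
    distance_m: float


def select_beam(vehicles: List[DetectedVehicle],
                dipping_distance_m: float = 400.0) -> str:
    """Return 'low' if any vehicle ahead could be dazzled, otherwise 'high'."""
    for v in vehicles:
        if v.kind in ("preceding", "oncoming") and v.distance_m < dipping_distance_m:
            return "low"
    return "high"


print(select_beam([DetectedVehicle("oncoming", 180.0)]))  # -> "low"
print(select_beam([]))                                    # -> "high"
```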
  • The audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the passengers of the vehicle or the outside of the vehicle of information.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are exemplified as output devices.
  • the display unit 12062 may include, for example, at least one of an onboard display and a heads-up display.
  • FIG. 24 is a diagram showing an example of the installation position of the imaging unit 12031.
  • the vehicle 12100 has image pickup units 12101, 12102, 12103, 12104, 12105 as the image pickup unit 12031.
  • The imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield inside the cabin of the vehicle 12100.
  • the imaging unit 12101 provided on the front nose and the imaging unit 12105 provided on the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
  • the imaging units 12102 and 12103 provided in the side mirrors mainly acquire images of the side of the vehicle 12100.
  • the imaging unit 12104 provided on the rear bumper or the back door mainly acquires an image of the rear of the vehicle 12100.
  • the images in front acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 24 shows an example of the photographing range of the imaging units 12101 to 12104.
  • The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose.
  • The imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively.
  • The imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 as viewed from above can be obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the image pickup units 12101 to 12104 may be a stereo camera composed of a plurality of image pickup elements, or may be an image pickup element having pixels for phase difference detection.
  • The microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative velocity with respect to the vehicle 12100) based on the distance information obtained from the imaging units 12101 to 12104, and can thereby extract, as a preceding vehicle, the closest three-dimensional object that is on the traveling path of the vehicle 12100 and that travels at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100.
  • Further, the microcomputer 12051 can set in advance an inter-vehicle distance to be secured from the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control for the purpose of automatic driving or the like in which the vehicle travels autonomously without depending on the operation of the driver.
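  • The selection and following-distance logic described above can be sketched as follows; the TrackedObject fields, the lane test, and the two-second headway are assumptions for illustration, while the selection criteria (closest object on the traveling path, moving in substantially the same direction at 0 km/h or more) follow the text.

```python
# Illustrative preceding-vehicle extraction and headway check.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class TrackedObject:
    distance_m: float      # distance from the vehicle 12100
    speed_kmh: float       # speed derived from the temporal change of the distance
    on_own_path: bool      # lies on the traveling path of the vehicle 12100
    same_direction: bool   # travels in substantially the same direction


def extract_preceding_vehicle(objects: List[TrackedObject],
                              min_speed_kmh: float = 0.0) -> Optional[TrackedObject]:
    """Closest object on the own path, same direction, moving at >= min speed."""
    candidates = [o for o in objects
                  if o.on_own_path and o.same_direction and o.speed_kmh >= min_speed_kmh]
    return min(candidates, key=lambda o: o.distance_m, default=None)


def headway_error(preceding: TrackedObject, own_speed_kmh: float,
                  headway_s: float = 2.0) -> float:
    """Positive: gap larger than desired (may accelerate); negative: brake."""
    desired_gap_m = own_speed_kmh / 3.6 * headway_s
    return preceding.distance_m - desired_gap_m


objs = [TrackedObject(45.0, 55.0, True, True),   # car ahead in the same lane
        TrackedObject(20.0, 30.0, False, True)]  # car in the adjacent lane
lead = extract_preceding_vehicle(objs)
if lead is not None:
    print(round(headway_error(lead, own_speed_kmh=60.0), 1))  # gap vs ~33 m desired
```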
  • For example, the microcomputer 12051 can classify three-dimensional object data on three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects based on the distance information obtained from the imaging units 12101 to 12104, extract the data, and use it for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see. Then, the microcomputer 12051 determines a collision risk indicating the degree of risk of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can provide driving assistance for collision avoidance by outputting a warning to the driver via the audio speaker 12061 or the display unit 12062 and by performing forced deceleration or avoidance steering via the drive system control unit 12010.
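  • A simplified sketch of the collision-risk decision is given below; the time-to-collision formula, the five-second ramp, and the set value of 0.6 are illustrative assumptions and not values taken from the disclosure.

```python
# Illustrative collision-risk estimate and resulting assistance action.
from dataclasses import dataclass


@dataclass
class Obstacle:
    label: str               # e.g. "pedestrian", "utility pole"
    distance_m: float
    closing_speed_ms: float  # > 0 when the obstacle is getting closer
    visible_to_driver: bool


def collision_risk(o: Obstacle) -> float:
    """Map time-to-collision to a 0..1 risk value (1 = imminent)."""
    if o.closing_speed_ms <= 0:
        return 0.0
    ttc_s = o.distance_m / o.closing_speed_ms
    return max(0.0, min(1.0, 1.0 - ttc_s / 5.0))  # risk ramps up below 5 s TTC


def assistance_action(o: Obstacle, set_value: float = 0.6) -> str:
    risk = collision_risk(o)
    if risk < set_value:
        return "none"
    # Warn via the audio speaker 12061 / display unit 12062 first; force
    # deceleration through the drive system control unit 12010 if needed.
    return "warn" if o.visible_to_driver else "brake_and_warn"


print(assistance_action(Obstacle("pedestrian", 8.0, 4.0, visible_to_driver=False)))
```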
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the captured image of the imaging units 12101 to 12104.
  • Such pedestrian recognition is performed by, for example, a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not the object is a pedestrian.
  • When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio/image output unit 12052 controls the display unit 12062 so as to superimpose and display a square contour line for emphasis on the recognized pedestrian. Further, the audio/image output unit 12052 may control the display unit 12062 so as to display an icon or the like indicating the pedestrian at a desired position.
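  • The two-step recognition and the rectangular overlay can be sketched as follows; the brightness threshold and the tall-and-narrow aspect-ratio test merely stand in for the actual feature extraction and pattern matching and are assumptions of this sketch.

```python
# Toy sketch: feature-point extraction, a pattern-matching stand-in, and the
# bounding rectangle to superimpose on the display unit.
from typing import Tuple

import numpy as np


def extract_feature_points(ir_image: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Return (row, col) coordinates of bright pixels as crude feature points."""
    return np.argwhere(ir_image > threshold)


def looks_like_pedestrian(points: np.ndarray) -> bool:
    """Pattern-matching stand-in: a pedestrian outline is tall and narrow."""
    if len(points) == 0:
        return False
    height = points[:, 0].max() - points[:, 0].min() + 1
    width = points[:, 1].max() - points[:, 1].min() + 1
    return height >= 2 * width


def bounding_box(points: np.ndarray) -> Tuple[int, int, int, int]:
    """(top, left, bottom, right) square contour line for the overlay."""
    return (int(points[:, 0].min()), int(points[:, 1].min()),
            int(points[:, 0].max()), int(points[:, 1].max()))


# Example: a 12x4 bright, person-shaped blob in a 20x20 infrared frame.
frame = np.zeros((20, 20))
frame[4:16, 8:12] = 1.0
pts = extract_feature_points(frame)
if looks_like_pedestrian(pts):
    print("overlay rectangle at", bounding_box(pts))
```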
  • the above is an example of a vehicle control system to which the technology according to the present disclosure can be applied.
  • the technique according to the present disclosure can be applied to the vehicle exterior information detection unit 12030 and the vehicle interior information detection unit 12040 among the configurations described above.
  • For example, by using the distance measurement by the distance measuring device 1, processing for recognizing the driver's gestures can be performed, various systems (for example, an audio system, a navigation system, or an air conditioning system) can be operated according to the gestures, and the driver's condition can be detected more accurately.
  • the distance measurement by the distance measuring device 1 can be used to recognize the unevenness of the road surface and reflect it in the control of the suspension.
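  • As one hedged example of how such road-surface information might feed the suspension, the sketch below maps the roughness of a height profile ahead of the vehicle to a damping setting; the roughness measure and the damping levels are illustrative assumptions.

```python
# Illustrative mapping from road unevenness to a suspension damping setting.
import numpy as np


def road_roughness(height_profile_mm: np.ndarray) -> float:
    """Standard deviation of the road height ahead, in millimetres."""
    return float(np.std(height_profile_mm))


def damping_setting(roughness_mm: float) -> str:
    if roughness_mm < 2.0:
        return "comfort"
    if roughness_mm < 10.0:
        return "normal"
    return "firm"


profile = np.array([0.0, 1.0, -2.0, 15.0, -12.0, 8.0])  # bumpy patch ahead
print(damping_setting(road_roughness(profile)))          # -> "normal"
```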
  • As the structure of the photodiode 41 of the pixel 31, the present technology can be applied to a distance measuring sensor having a CAPD (Current Assisted Photonic Demodulator) structure, or to a gate-type distance measuring sensor having a structure in which the charge of the photodiode is alternately distributed to two gates and stored in the corresponding charge storage units.
  • the configuration described as one device (or processing unit) may be divided and configured as a plurality of devices (or processing units).
  • the configurations described above as a plurality of devices (or processing units) may be collectively configured as one device (or processing unit).
  • a configuration other than the above may be added to the configuration of each device (or each processing unit).
  • A part of the configuration of one device (or processing unit) may be included in the configuration of another device (or another processing unit).
  • In this specification, a system means a set of a plurality of components (devices, modules (parts), etc.), and it does not matter whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
  • the present technology can have the following configurations.
  • (1) A distance measuring device including: a light emitting source that emits irradiation light; a light receiving sensor that receives reflected light that has been reflected by an object and returned; a coding unit that generates a coded light source modulation signal and a coded sensor modulation signal by coding, according to a predetermined code, a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor; a light source delay unit that generates a delayed light source modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded light source modulation signal; and a sensor delay unit that generates a delayed sensor modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded sensor modulation signal.
  • The distance measuring device according to (2) or (3), in which the control unit supplies the predetermined delay amount determined when the measurement mode is the second mode to the light source delay unit, and supplies the predetermined delay amount determined when the measurement mode is the third mode to the sensor delay unit.
  • The distance measuring device in which the measurement mode includes a first mode in which the predetermined delay amount is zero and a second mode in which the predetermined delay amount is positive.
  • The distance measuring device according to (2), in which the control unit sets the measurement mode to the second mode and executes distance measurement, and then sets the measurement mode to the first mode and executes distance measurement. (6)
  • (7) The distance measuring device according to any one of (1) to (6), further comprising a code generation unit that generates the predetermined code in units of an integral multiple of the period of the light source modulation signal and the sensor modulation signal.
  • A method of controlling a distance measuring device having a light emitting source that emits irradiation light and a light receiving sensor that receives reflected light that has been reflected by an object and returned, the method including generating a coded light source modulation signal and a coded sensor modulation signal by coding, according to a predetermined code, a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor.
  • (10) An electronic device including a distance measuring device that includes: a light emitting source that emits irradiation light; a light receiving sensor that receives reflected light that has been reflected by an object and returned; a coding unit that generates a coded light source modulation signal and a coded sensor modulation signal by coding, according to a predetermined code, a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor; a light source delay unit that generates a delayed light source modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded light source modulation signal; and a sensor delay unit that generates a delayed sensor modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded sensor modulation signal.
  • 1 distance measuring device, 11 timing signal generator, 12 phase setting unit, 13 light source modulation unit, 14 sensor modulation unit, 15 code generator, 16 coding unit, 17 light source delay unit, 18 sensor delay unit, 19 light source, 20 light receiving sensor, 21 control unit, 101 smartphone, 201 smartphone, 202 distance measuring module
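  • As a purely illustrative sketch of the signal relationships summarized in the configurations above (not an implementation of the timing signal generator 11, the coding unit 16, or the delay units 17 and 18), the following Python code builds a coded light source modulation signal and a coded sensor modulation signal from a common square wave and a code whose chips span an integral number of modulation periods, and then delays either the light source side or the sensor side depending on an assumed measurement mode; the sample rate, the code values, and the mode names are assumptions.

```python
# Illustrative-only coded modulation with a per-side delay; the real units are
# hardware blocks and the actual coding scheme is not reproduced here.
import numpy as np

SAMPLES_PER_PERIOD = 16    # samples per modulation period (assumed)
PERIODS_PER_CHIP = 4       # chips span an integral number of periods (cf. (7))
CODE = np.array([1, 0, 1, 1, 0, 0, 1, 0])   # assumed predetermined code


def modulation_signal(n_periods: int) -> np.ndarray:
    """Basic square-wave modulation signal (1 = emit / accumulate)."""
    one_period = np.concatenate([np.ones(SAMPLES_PER_PERIOD // 2),
                                 np.zeros(SAMPLES_PER_PERIOD // 2)])
    return np.tile(one_period, n_periods)


def code_signal(signal: np.ndarray) -> np.ndarray:
    """Invert the signal during chips where the code bit is 0 (one possible coding)."""
    chip_len = PERIODS_PER_CHIP * SAMPLES_PER_PERIOD
    coded = signal.copy()
    for i, bit in enumerate(CODE):
        if bit == 0:
            sl = slice(i * chip_len, (i + 1) * chip_len)
            coded[sl] = 1 - coded[sl]
    return coded


def delayed(signal: np.ndarray, delay_samples: int) -> np.ndarray:
    """Delay the phase of a signal by shifting it right and zero-padding."""
    return np.concatenate([np.zeros(delay_samples), signal[:len(signal) - delay_samples]])


def generate(mode: str, delay_samples: int):
    base = modulation_signal(len(CODE) * PERIODS_PER_CHIP)
    coded_light, coded_sensor = code_signal(base), code_signal(base)
    if mode == "short":   # assumed: second mode delays the light source side
        return delayed(coded_light, delay_samples), coded_sensor
    if mode == "long":    # assumed: third mode delays the sensor side
        return coded_light, delayed(coded_sensor, delay_samples)
    return coded_light, coded_sensor   # normal mode: zero delay


light, sensor = generate("long", delay_samples=8)
print(light[:20])   # the sensor trace below lags this one by 8 samples
print(sensor[:20])
```

  • Delaying the sensor-side signal aligns the light reception window with echoes that arrive later, which is one way such a delay could bias the measurable range toward longer distances, while delaying the light source side has the opposite effect; this pairing is an interpretation consistent with the second and third modes described above, not a statement of the disclosed implementation.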

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Measurement Of Optical Distance (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The present disclosure relates to a distance measuring device configured to be able to perform distance measurement limited to a desired distance range, a control method therefor, and an electronic apparatus. The distance measuring device includes: a light emitting source that emits irradiation light; a light receiving sensor that receives reflected light produced when the irradiation light is reflected by an object and returned; a coding unit that generates a coded light source modulation signal and a coded sensor modulation signal by coding, according to a predetermined code, a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor; a light source delay unit that generates a delayed light source modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded light source modulation signal; and a sensor delay unit that generates a delayed sensor modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded sensor modulation signal. The present disclosure can be applied, for example, to a ranging module that measures the distance to a subject, and the like.
PCT/JP2020/045758 2019-12-23 2020-12-09 Dispositif de télémétrie, procédé de commande de dispositif de télémétrie et appareil électronique WO2021131684A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019231535A JP2021099271A (ja) 2019-12-23 2019-12-23 Distance measuring device, control method therefor, and electronic apparatus
JP2019-231535 2019-12-23

Publications (1)

Publication Number Publication Date
WO2021131684A1 true WO2021131684A1 (fr) 2021-07-01

Family

ID=76541050

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/045758 WO2021131684A1 (fr) 2019-12-23 2020-12-09 Dispositif de télémétrie, procédé de commande de dispositif de télémétrie et appareil électronique

Country Status (2)

Country Link
JP (1) JP2021099271A (fr)
WO (1) WO2021131684A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11281744A * 1998-01-28 1999-10-15 Nikon Corp Distance measuring device
JP2002181934A * 2000-12-15 2002-06-26 Nikon Corp Timing device, timing method, and distance measuring device
JP2005300233A * 2004-04-07 2005-10-27 Denso Corp Radar device for vehicle
WO2010098454A1 * 2009-02-27 2010-09-02 Panasonic Electric Works Co., Ltd. Distance measuring apparatus
JP2016045066A * 2014-08-22 2016-04-04 Hamamatsu Photonics K.K. Distance measuring method and distance measuring device
WO2016075885A1 * 2014-11-11 2016-05-19 Panasonic Intellectual Property Management Co., Ltd. Distance detection device and distance detection method


Also Published As

Publication number Publication date
JP2021099271A (ja) 2021-07-01

Similar Documents

Publication Publication Date Title
US10746874B2 (en) Ranging module, ranging system, and method of controlling ranging module
JP7214363B2 (ja) 測距処理装置、測距モジュール、測距処理方法、およびプログラム
WO2021085128A1 (fr) Dispositif de mesure de distance, procédé de mesure, et système de mesure de distance
JP7321834B2 (ja) 照明装置、および、測距モジュール
US11561303B2 (en) Ranging processing device, ranging module, ranging processing method, and program
WO2021065494A1 (fr) Capteur de mesure de distances, procédé de traitement de signaux et module de mesure de distances
WO2020246264A1 (fr) Capteur de mesure de distance, procédé de traitement de signal et module de mesure de distance
WO2020209079A1 (fr) Capteur de mesure de distance, procédé de traitement de signal et module de mesure de distance
WO2021065500A1 (fr) Capteur de mesure de distance, procédé de traitement de signal, et module de mesure de distance
WO2021131684A1 (fr) Dispositif de télémétrie, procédé de commande de dispositif de télémétrie et appareil électronique
US20220381917A1 (en) Lighting device, method for controlling lighting device, and distance measurement module
WO2021039458A1 (fr) Capteur de mesure de distance, procédé de commande associé et module de mesure de distance
WO2021106624A1 (fr) Capteur de mesure de distance, système de mesure de distance, et appareil électronique
JP7490653B2 (ja) 測定装置および測定方法、並びにプログラム
WO2021065495A1 (fr) Capteur de télémétrie, procédé de traitement de signal, et module de télémétrie
JP7494200B2 (ja) 照明装置、照明装置の制御方法、および、測距モジュール
US20220413144A1 (en) Signal processing device, signal processing method, and distance measurement device
JP7476170B2 (ja) 信号処理装置、信号処理方法、および、測距モジュール
WO2021106623A1 (fr) Capteur de mesure de distance, système de mesure de distance et appareil électronique
WO2022004441A1 (fr) Dispositif de télémétrie et procédé de télémétrie

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20906321

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20906321

Country of ref document: EP

Kind code of ref document: A1