WO2021085128A1 - Distance measurement device, measurement method, and distance measurement system - Google Patents

Distance measurement device, measurement method, and distance measurement system

Info

Publication number
WO2021085128A1
Authority
WO
WIPO (PCT)
Prior art keywords
distance
distance measurement
light
distance measuring
measurement value
Prior art date
Application number
PCT/JP2020/038712
Other languages
French (fr)
Japanese (ja)
Inventor
Tomoo Mitsunaga
Cedric Caron
Jeroen Hermans
Original Assignee
Sony Semiconductor Solutions Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corporation
Publication of WO2021085128A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00Measuring distances in line of sight; Optical rangefinders
    • G01C3/02Details
    • G01C3/06Use of electric means to obtain final indication
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/46Indirect determination of position data
    • G01S17/48Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging

Definitions

  • the present technology relates to a distance measuring device, a measuring method, and a distance measuring system, and more particularly to a distance measuring device, a measuring method, and a distance measuring system capable of realizing a wide measurement range by using spot light.
  • a distance measuring module is mounted on a mobile terminal such as a so-called smartphone, which is a small information processing device having a communication function.
  • Examples of the distance measuring method in the distance measuring module include the Indirect ToF (Time of Flight) method and the Structured Light method.
  • The Indirect ToF method is a method that irradiates light toward an object, detects the light reflected on the surface of the object, and calculates the distance to the object based on the measured value obtained by measuring the flight time of the light (see, for example, Patent Document 1).
  • the Structured Light method is a method of irradiating a pattern light toward an object and calculating the distance to the object based on an image obtained by capturing the distortion of the pattern on the surface of the object.
  • planar light and spot light can be used as the irradiation light to irradiate the object.
  • Plane light can irradiate a wide area, but since the emission intensity is dispersed, an object at a long distance cannot be measured (the measurement range is short).
  • Spot light can measure farther because the optical power density can be increased, and since the influence of multipath can be suppressed, improved accuracy can be expected.
  • However, when the irradiation light is spot light, the amount of light received from an object at a short distance becomes large, so the pixels may saturate and measurement may become impossible.
  • This technology was made in view of such a situation, and makes it possible to realize a wide measurement range by using spot light.
  • The distance measuring device of the first aspect of the present technology includes: a distance measuring sensor that receives reflected light, which is pattern light having two types of brightness, a bright part and a dark part, emitted from a light source device and reflected back by an object; and a signal processing unit that detects, as a phase difference, the time from when the pattern light is emitted until it is received as the reflected light, calculates a first distance measurement value, which is the distance to the object, based on the phase difference, detects the position of the bright part of the pattern light, calculates a second distance measurement value, which is the distance to the object, by the principle of triangulation using the detected position of the bright part, and calculates a fusion distance measurement value obtained by fusing the first distance measurement value and the second distance measurement value.
  • In the measurement method of the second aspect of the present technology, a distance measuring device detects, as a phase difference, the time from when pattern light having two types of brightness, a bright part and a dark part, emitted from a light source device is emitted until it is reflected by an object and received as reflected light; calculates a first distance measurement value, which is the distance to the object, based on the phase difference; detects the position of the bright part of the pattern light; calculates a second distance measurement value, which is the distance to the object, by the principle of triangulation using the detected position of the bright part; and calculates a fusion distance measurement value obtained by fusing the first distance measurement value and the second distance measurement value.
  • The ranging system of the third aspect of the present technology includes a light source device that emits pattern light having two types of brightness, a bright part and a dark part, and a distance measuring device that receives the reflected light that is reflected back by an object. The distance measuring device includes a distance measuring sensor that receives the reflected light, and a signal processing unit that detects, as a phase difference, the time from when the pattern light is emitted until it is received as the reflected light, calculates a first distance measurement value, which is the distance to the object, based on the phase difference, detects the position of the bright part of the pattern light, calculates a second distance measurement value, which is the distance to the object, by the principle of triangulation using the detected position of the bright part, and calculates a fusion distance measurement value obtained by fusing the first distance measurement value and the second distance measurement value.
  • In the first to third aspects of the present technology, the pattern light having two types of brightness, a bright part and a dark part, emitted from the light source device and reflected back by the object is received as reflected light; the time from when the pattern light is emitted until it is received as the reflected light is detected as a phase difference; the first distance measurement value, which is the distance to the object, is calculated based on the phase difference; the position of the bright part of the pattern light is detected; the second distance measurement value, which is the distance to the object, is calculated by the principle of triangulation using the detected position of the bright part; and the fusion distance measurement value obtained by fusing the first distance measurement value and the second distance measurement value is calculated.
  • the distance measuring device and the distance measuring system may be independent devices or may be modules incorporated in other devices.
  • FIG. 1 shows a schematic configuration example of a distance measuring system to which the present technology is applied.
  • the distance measuring system 1 shown in FIG. 1 includes a light source device 11, a light emitting side optical system 12, a distance measuring device 21, and a light receiving side optical system 22.
  • the light source device 11 generates and irradiates the pattern light 15 having two types of brightness, a bright part and a dark part.
  • The pattern light 15 is, for example, pattern light in which a plurality of dot-shaped (circular) spots SP arranged at regular or irregular predetermined intervals, as shown in FIG. 1, form the bright part and the remaining region forms the dark part.
  • the pattern light 15 emitted by the light source device 11 is not limited to a pattern having a dot shape in the bright portion, and may be a grid pattern or the like.
  • the pattern light 15 emitted from the light source device 11 irradiates a predetermined object OBJ as an object to be measured via the light emitting side optical system 12. Then, the pattern light 15 is reflected by the predetermined object OBJ and is incident on the distance measuring device 21 via the light receiving side optical system 22.
  • the ranging device 21 receives the pattern light 15 that is reflected by the object OBJ and is incident.
  • the distance measuring device 21 generates a detection signal according to the amount of light of the received pattern light 15. Then, the distance measuring device 21 calculates and outputs a distance measuring value which is a measured value of the distance to a predetermined object OBJ based on the detection signal.
  • FIG. 2 is a block diagram showing a configuration example of the light source device 11 and the distance measuring device 21.
  • the light source device 11 has a light emitting source 31 and a light source driving unit 32.
  • the distance measuring device 21 includes a synchronization control unit 41, a distance measuring sensor 42, and a signal processing unit 43.
  • the light emitting source 31 is composed of a light source array in which a plurality of light emitting elements such as a VCSEL (Vertical Cavity Surface Emitting Laser) are arranged in a plane direction.
  • Under the control of the light source driving unit 32, the light emitting source 31 emits light modulated at the timing of the light emission timing signal supplied from the synchronization control unit 41 of the distance measuring device 21, and irradiates the predetermined object OBJ with the pattern light 15 as the irradiation light.
  • As the irradiation light, for example, infrared light having a wavelength in the range of about 850 nm to 940 nm is used.
  • the light source driving unit 32 is composed of, for example, a laser driver or the like, and causes each light emitting element of the light emitting source 31 to emit light in response to a light emitting timing signal supplied from the synchronous control unit 41.
  • the synchronization control unit 41 of the distance measuring device 21 generates a light emission timing signal that controls the timing at which each light emitting element of the light emitting source 31 emits light, and supplies it to the light source driving unit 32. Further, the synchronization control unit 41 also supplies a light emission timing signal to the distance measurement sensor 42 in order to drive the distance measurement sensor 42 in accordance with the light emission timing of the light emission source 31.
  • As the light emission timing signal, for example, a rectangular wave signal (pulse signal) that turns on and off at a predetermined frequency (for example, 20 MHz, 50 MHz, etc.) can be used.
  • the emission timing signal is not limited to a rectangular wave as long as it is a periodic signal, and may be, for example, a sine wave.
  • The distance measuring sensor 42 receives, with a pixel array unit 63 (FIG. 6) in which a plurality of pixels 71 (FIG. 6) are two-dimensionally arranged in a matrix, the reflected light of the pattern light 15 emitted from the light source device 11 and reflected by the predetermined object OBJ. Then, the distance measuring sensor 42 supplies detection signals corresponding to the received amount of the reflected light to the signal processing unit 43 in units of pixels of the pixel array unit 63.
  • The signal processing unit 43 calculates the distance measurement value, which is the distance from the distance measuring sensor 42 to the predetermined object OBJ, based on the detection signals supplied from the distance measuring sensor 42. More specifically, the signal processing unit 43 calculates the distance measurement value in the ToF method, which is the first detection method (the first distance measurement value), and the distance measurement value in the SL (Structured Light) method, which is the second detection method (the second distance measurement value). Then, the signal processing unit 43 calculates the fusion distance measurement value by fusing the first distance measurement value and the second distance measurement value, and outputs it to the outside.
  • the ToF method is a method in which the time from the irradiation of the spot SP, which is the bright part of the pattern light 15, to the reception as the reflected light is detected as the phase difference, and the distance is calculated based on the phase difference.
  • The SL method is a method in which the position of the spot SP, which is the bright part of the pattern light 15, is detected, and the distance is calculated by the principle of triangulation using the detected position of the spot light.
  • The ranging system 1 configured as described above irradiates the object OBJ with the pattern light 15 having two types of brightness, light and dark, as shown in FIG. 1, calculates the first distance measurement value by the ToF method, which is the first detection method, and the second distance measurement value by the SL method, which is the second detection method, calculates the fusion distance measurement value from the first distance measurement value and the second distance measurement value, and outputs it to the outside.
  • the synchronization control unit 13 that generates a light emission timing signal may be provided not on the distance measuring device 21 side but on the light source device 11 side.
  • The distance measurement value D [mm] corresponding to the distance from the distance measuring device 21 to the object OBJ can be calculated by the following equation (1): D = (c × Δt) / 2 ... (1)
  • ⁇ t in the equation (1) is the time until the pattern light 15 emitted from the light emitting source 31 is reflected by the object OBJ and enters the distance measuring sensor 42, and c represents the speed of light.
  • As the irradiation light, pulsed light that repeatedly turns on and off at a predetermined frequency f (modulation frequency), as shown in FIG. 3, is adopted.
  • One cycle T of pulsed light is 1 / f.
  • The reflected light is detected with its phase shifted according to the time Δt taken to travel from the light emitting source 31 to the distance measuring sensor 42. Denoting this phase shift (phase difference) by φ, the time Δt can be calculated by the following equation (2): Δt = (φ / 2π) × (1 / f) ... (2)
  • From equations (1) and (2), the distance measurement value D from the distance measuring sensor 42 to the object OBJ can be calculated by the following equation (3): D = (c × φ) / (4πf) ... (3)
  • Each pixel of the pixel array formed on the distance measuring sensor 42 repeats ON / OFF at high speed, and accumulates electric charge only during the ON period.
  • the distance measuring sensor 42 sequentially switches the ON / OFF execution timing of each pixel of the pixel array, accumulates the electric charge at each execution timing, and outputs a detection signal according to the accumulated electric charge.
  • The four execution timings are phase 0 degrees, phase 90 degrees, phase 180 degrees, and phase 270 degrees.
  • The execution timing of phase 0 degrees is a timing at which the ON timing (light receiving timing) of each pixel of the pixel array is set to the same phase as the pulsed light (light emission pattern) emitted by the light emitting source 31 of the light source device 11.
  • the execution timing of the phase 90 degrees is a timing in which the ON timing (light receiving timing) of each pixel of the pixel array is set to a phase 90 degrees behind the pulsed light (light emitting pattern) emitted by the light emitting source 31 of the light source device 11.
  • the execution timing of the phase 180 degrees is a timing in which the ON timing (light receiving timing) of each pixel of the pixel array is set to a phase 180 degrees behind the pulsed light (light emitting pattern) emitted by the light emitting source 31 of the light source device 11.
  • the execution timing of the phase 270 degrees is a timing in which the ON timing (light receiving timing) of each pixel of the pixel array is delayed by 270 degrees from the pulsed light (light emitting pattern) emitted by the light emitting source 31 of the light source device 11.
  • the ranging sensor 42 sequentially switches the light receiving timing in the order of phase 0 degree, phase 90 degree, phase 180 degree, and phase 270 degree, and acquires the received light amount (accumulated charge) of the reflected light at each light receiving timing.
  • In the figure, the timing at which the reflected light is incident is shaded.
  • Denoting the accumulated charges at the light receiving timings of phase 0 degrees, phase 90 degrees, phase 180 degrees, and phase 270 degrees by Q0, Q90, Q180, and Q270, the phase difference φ can be calculated by the following equation (4): φ = arctan((Q90 − Q270) / (Q0 − Q180)) ... (4)
  • By substituting the phase difference φ obtained by equation (4) into equation (3), the distance measurement value D from the distance measuring system 1 to the object OBJ can be calculated.
  • The distance measuring sensor 42 switches the light receiving timing of each pixel of the pixel array in the order of phase 0 degrees, phase 90 degrees, phase 180 degrees, and phase 270 degrees as described above, and sequentially supplies the detection signals corresponding to the accumulated charges at each phase (charge Q0, charge Q90, charge Q180, and charge Q270) to the signal processing unit 43.
  • In a pixel structure having two charge storage units per pixel, as described later, detection signals of two light receiving timings whose phases are inverted, for example phase 0 degrees and phase 180 degrees, can be acquired in one frame. Therefore, detection signals of two frames are sufficient to acquire the four-phase detection signals of phase 0 degrees, phase 90 degrees, phase 180 degrees, and phase 270 degrees.
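  • As a concrete illustration of equations (3) and (4), the following sketch (not part of the patent; the function name, sample charge values, and the 20 MHz default are illustrative assumptions) converts four-phase accumulated charges into a distance:

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def itof_distance(q0, q90, q180, q270, mod_freq_hz=20e6):
    """Distance [m] from four-phase accumulated charges.

    Implements phi = arctan((Q90 - Q270) / (Q0 - Q180))  ... (4)
    and        D   = c * phi / (4 * pi * f)              ... (3).
    """
    phi = math.atan2(q90 - q270, q0 - q180)
    if phi < 0.0:
        phi += 2.0 * math.pi  # wrap the phase into [0, 2*pi)
    return C * phi / (4.0 * math.pi * mod_freq_hz)

# Synthetic charges; at f = 20 MHz the unambiguous range c/(2f) is 7.5 m.
print(f"{itof_distance(60.0, 80.8, 50.0, 50.0):.2f} m")  # ~1.50 m
```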
  • The SL method uses a light source device that projects pattern light and a light receiving sensor (image sensor) that receives the pattern light, searches for a pair consisting of a position in the projection pattern and the corresponding position on the light receiving sensor, and measures the distance by applying triangulation.
  • The light source device 11 irradiates the object OBJ with a plurality of spots SP arranged at predetermined intervals. Focusing on one spot SP (hereinafter referred to as the spot of interest SP), assume that the spot of interest SP is detected at a predetermined position P2 in the light receiving region of the distance measuring sensor 42.
  • The position P1 in the projection pattern of the light source device 11 from which the spot of interest SP was emitted is known to the light source device 11. The positional relationship between the light source device 11 and the distance measuring sensor 42, including the baseline distance BL between the light source principal point (projection center) of the light source device 11 and the sensor principal point (light receiving center) of the distance measuring sensor 42, is also known. Therefore, using the position P1, the position P2, and the baseline distance BL, the distance measurement value D corresponding to the distance from the distance measuring device 21 to the object OBJ can be calculated by the principle of triangulation.
  • When the ranging sensor 42 receives a plurality of spots SP of the pattern light 15, if it is known which of the spots SP emitted by the light source device 11 each received spot SP corresponds to, the distance measurement value D from the distance measuring device 21 to the object OBJ can be calculated for each of the plurality of spots SP.
  • Therefore, the pattern light 15 emitted by the light source device 11 of the distance measuring system 1 is a pattern in which a plurality of spots SP are arranged so that, when the distance measuring sensor 42 receives a given spot SP at a given position in the light receiving region, the spot SP (position) of the light emitting source 31 that emitted it can be identified.
  • The position of a spot SP received as reflected light by the distance measuring sensor 42 moves along a predetermined trajectory within the light receiving region according to the distance to the object OBJ; as long as the trajectory of each spot SP does not overlap the trajectories of the other spots SP, the spot SP (position) of the light emitting source 31 that emitted it can be identified from the position of the spot SP received by the distance measuring sensor 42.
  • the pattern light 15 is a dot pattern in which a plurality of spot SPs are arranged at sufficiently sparse intervals so that the positions of the spot SPs detected by the distance measuring sensor 42 do not overlap with the other spot SPs.
  • As long as, for each of the plurality of bright parts, the correspondence between the bright part detected in the light receiving region and the bright part of the light emitting source 31 that emitted it can be identified, the bright parts (dots) do not have to be regularly arranged as shown in FIG. 1.
  • The pattern light 15 may be a pattern in which dots (spots SP) serving as bright parts are irregularly arranged, as shown in A of FIG.
  • In this case, the correspondence between the light receiving position and the light emitting position can be detected by utilizing the features of the irregular arrangement of dots, and the distance measurement value by the Structured Light method can also be calculated using those features.
  • The bright part of the pattern light 15 does not have to be a dot.
  • For example, a pattern in which grid patterns, each having arbitrary cells of a 3x3 square grid set as bright parts, are arranged may be used.
  • In this case, which grid pattern has been received can be identified from the arrangement of the bright cells within the 3x3 square grid, so the correspondence between the position of a grid pattern in the light receiving region of the ranging sensor 42 and the position of the light emitting source 31 that emitted that grid pattern can be detected.
  • The ToF method requires detection signals of two frames as described above, whereas the SL method can calculate the distance from the detection signal of one frame; therefore, the distance measurement value D can be calculated using the detection signal of either one of the two frames used in the ToF method.
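  • The triangulation step can be sketched as follows under a simplified rectified pinhole model (an assumption for illustration; the patent does not specify this exact geometry, and all names and numbers are hypothetical): the emission position P1, the received position P2, and the baseline distance BL determine the distance.

```python
def sl_distance(baseline_mm, focal_px, emit_x_px, recv_x_px):
    """Depth [mm] by triangulation: Z = f * BL / d, where d is the
    disparity between the emission position P1 and the received
    position P2, both expressed in pixels on a common image plane."""
    disparity = emit_x_px - recv_x_px
    if disparity <= 0:
        raise ValueError("not a valid P1/P2 correspondence")
    return focal_px * baseline_mm / disparity

# BL = 50 mm, focal length 800 px, spot shifted by 8 px -> 5 m.
print(f"{sl_distance(50.0, 800.0, 120.0, 112.0):.0f} mm")  # 5000 mm
```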
  • FIG. 6 is a block diagram showing a configuration example of the distance measuring sensor 42.
  • the distance measuring sensor 42 includes a timing control unit 61, a row scanning circuit 62, a pixel array unit 63, a plurality of AD (Analog to Digital) conversion units 64, a column scanning circuit 65, and a signal processing unit 66.
  • In the pixel array unit 63, a plurality of pixels 71 are two-dimensionally arranged in a matrix in the row direction and the column direction.
  • Here, the row direction means the arrangement direction of the pixels 71 in the horizontal direction, and the column direction means the arrangement direction of the pixels 71 in the vertical direction; in the figure, the row direction is horizontal and the column direction is vertical.
  • The timing control unit 61 is composed of, for example, a timing generator that generates various timing signals; it generates various timing signals in synchronization with the light emission timing signal supplied from the synchronization control unit 41 (FIG. 2) and supplies them to the row scanning circuit 62, the AD conversion units 64, and the column scanning circuit 65. That is, the timing control unit 61 controls the drive timing of the row scanning circuit 62, the AD conversion units 64, and the column scanning circuit 65.
  • the row scanning circuit 62 is composed of, for example, a shift register, an address decoder, or the like, and drives each pixel 71 of the pixel array unit 63 at the same time for all pixels or in units of rows.
  • The pixel 71 receives the reflected light under the control of the row scanning circuit 62 and outputs a detection signal (pixel signal) at a level corresponding to the amount of light received. Details of the pixel 71 will be described later with reference to FIG. 7.
  • The pixel drive line 72 is wired along the horizontal direction for each pixel row, and the vertical signal line 73 is wired along the vertical direction for each pixel column.
  • the pixel drive line 72 transmits a drive signal for driving when reading a detection signal from the pixel 71.
  • In FIG. 6, the pixel drive line 72 is shown as one wire, but it is actually composed of a plurality of wires. Similarly, although the vertical signal line 73 is shown as one wire, it is actually composed of a plurality of wires.
  • The AD conversion unit 64 is provided for each pixel column and AD-converts the detection signal supplied from each pixel 71 of the corresponding column via the vertical signal line 73, in synchronization with the clock signal CK supplied from the timing control unit 61.
  • the AD conversion unit 64 outputs the AD-converted detection signal (detection data) to the signal processing unit 66 under the control of the column scanning circuit 65.
  • the column scanning circuit 65 sequentially selects the AD conversion unit 64 and outputs the detection data after the AD conversion to the signal processing unit 66.
  • the signal processing unit 66 has at least an arithmetic processing function, and performs various signal processing such as arithmetic processing based on the detection data output from the AD conversion unit 64.
  • FIG. 7 is a block diagram showing a configuration example of the pixel 71.
  • the pixel 71 includes a photoelectric conversion element 81, a transfer switch 82, charge storage units 83 and 84, and selection switches 85 and 86.
  • the photoelectric conversion element 81 is composed of, for example, a photodiode, and photoelectrically converts the reflected light to generate an electric charge.
  • the transfer switch 82 transfers the charge generated by the photoelectric conversion element 81 to either the charge storage unit 83 or 84 based on the transfer signal SEL_FD.
  • the transfer switch 82 is composed of, for example, a pair of MOS (Metal-Oxide-Semiconductor) transistors.
  • the charge storage units 83 and 84 are composed of, for example, a floating diffusion layer, accumulate charges, and generate a voltage corresponding to the accumulated charges.
  • the charges accumulated in the charge storage units 83 and 84 can be reset based on the reset signal RST.
  • the selection switch 85 selects the output of the charge storage unit 83 according to the selection signal RD_FD1.
  • The selection switch 86 selects the output of the charge storage unit 84 according to the selection signal RD_FD2. That is, when the selection switch 85 or 86 is turned on by the selection signal RD_FD1 or RD_FD2, a voltage signal corresponding to the accumulated charge of the turned-on charge storage unit 83 or 84 is output as a detection signal to the AD conversion unit 64 via the vertical signal line 73.
  • Each of the selection switches 85 and 86 is composed of, for example, a MOS transistor.
  • the wiring for transmitting the transfer signal SEL_FD, the reset signal RST, and the selection signals RD_FD1 and RD_FD2 corresponds to the pixel drive line 72 in FIG.
  • With the charge storage unit 83 serving as a first tap and the charge storage unit 84 serving as a second tap, the pixel 71 distributes the charges generated by the photoelectric conversion element 81 between the first tap and the second tap by switching the transfer switch 82.
  • This makes it possible to acquire, in one frame, detection signals of two light receiving timings whose phases are inverted, such as phase 0 degrees and phase 180 degrees, and, in the next frame, the detection signals of the two light receiving timings of phase 90 degrees and phase 270 degrees.
  • For the SL method, the detection signals of the first tap and the second tap are added up to obtain the pixel signal of one pixel. As with an ordinary image sensor, the position of each spot SP in the light receiving region can be specified based on the pixel signal (luminance information) of each pixel.
  • The summation of the detection signals of the first tap and the second tap can be performed by the signal processing unit 43 in the subsequent stage.
  • the distance measuring sensor 42 can operate with the same drive for both the ToF method and the SL method.
  • FIG. 8 is a perspective view showing a chip configuration example of the distance measuring device 21.
  • the distance measuring device 21 can be composed of one chip in which the first die (board) 91 and the second die (board) 92 are laminated.
  • the first die 91 is configured with, for example, a synchronization control unit 41 and a distance measuring sensor 42
  • the second die 92 is configured with, for example, a signal processing unit 43.
  • The distance measuring device 21 may be composed of three layers in which another logic die is laminated in addition to the first die 91 and the second die 92, or may be composed of four or more layers of dies (boards).
  • Alternatively, the distance measuring device 21 can be configured by forming a first chip 95 as the distance measuring sensor 42 and a second chip 96 as the signal processing unit 43 on a relay board 97.
  • the synchronization control unit 41 is included in either the first chip 95 or the second chip 96.
  • In step S1, the synchronization control unit 41 generates a light emission timing signal and supplies it to the light source driving unit 32 and the distance measuring sensor 42.
  • In step S2, under the control of the light source driving unit 32, the light emitting source 31 emits the pattern light 15 modulated according to the light emission timing signal.
  • the light emitting source 31 irradiates the predetermined object OBJ with the irradiation light in synchronization with the light emission timing signal.
  • the pattern light 15 is, for example, infrared light having a dot pattern having a plurality of spots SP arranged at predetermined intervals as bright areas and other areas as dark areas, as shown in FIG.
  • In step S3, the distance measuring sensor 42 receives the reflected light of the pattern light 15 emitted from the light emitting source 31 and reflected by the predetermined object OBJ. Then, the distance measuring sensor 42 supplies the detection signals of the first tap and the second tap corresponding to the received amount of the reflected light to the signal processing unit 43 for each pixel 71 of the pixel array unit 63. As described above, since the ToF method requires detection signals of two frames, at least two frames of detection signals are acquired and supplied to the signal processing unit 43.
  • In step S4, using the detection signals of the first tap and the second tap of each pixel 71 of the pixel array unit 63 supplied from the distance measuring sensor 42, the signal processing unit 43 calculates, for each of the plurality of spots SP of the pattern light 15, the distance measurement value by the ToF method and stores it in the internal memory as the first distance measurement value D1.
  • The detection signal of each tap (first tap or second tap) of a pixel 71 corresponding to one spot SP, which is a bright part of the pattern light 15, contains not only the component d_A of the spot light directly reflected by the predetermined object OBJ but also the component I_A of light indirectly reflected at places other than the predetermined object OBJ (for example, a wall or another object); it is therefore represented by (d_A + I_A). In a pixel 71 corresponding to a dark part, only the detection signal I_A corresponding to the indirect reflection component is detected.
  • Using the detection signals (d_A + I_A) of the pixels 71 corresponding to the bright parts and the detection signals I_A of the pixels 71 corresponding to the dark parts, the signal processing unit 43 estimates the position of each spot SP in the light receiving region (pixel array unit 63) of the distance measuring sensor 42.
  • The signal processing unit 43 then calculates the difference between the detection signal (d_A + I_A) of a pixel 71 corresponding to a bright part and the detection signal I_A of a pixel 71 corresponding to a dark part, that is, d_A = (d_A + I_A) − I_A, to remove the indirect reflection component.
  • Then, for each spot SP estimated in the light receiving region, the signal processing unit 43 uses the detection signals d_A of the first tap and the second tap, from which the indirect reflection component has been removed, to calculate the phase difference φ according to equation (4) and calculate the first distance measurement value D1.
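  • A minimal sketch of this subtraction (array names and sample values are hypothetical), removing the indirect component I_A from the bright-part signal (d_A + I_A):

```python
import numpy as np

def remove_indirect(bright_signal, dark_signal):
    """Per-tap direct component: d_A = (d_A + I_A) - I_A.

    bright_signal: tap detection signals of a pixel on a spot SP
                   (direct + indirect reflection components).
    dark_signal:   tap detection signals of a nearby dark-part pixel
                   (indirect component I_A only).
    """
    return np.maximum(bright_signal - dark_signal, 0.0)  # clamp noise

bright = np.array([72.0, 95.0, 31.0, 12.0])  # (d_A + I_A), 0/90/180/270 deg
dark = np.array([12.0, 15.0, 11.0, 12.0])    # I_A at the same phases
print(remove_indirect(bright, dark))         # d_A -> [60. 80. 20.  0.]
```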
  • In step S5, using the detection signals of the first tap and the second tap of each pixel 71 of the pixel array unit 63 supplied from the distance measuring sensor 42, the signal processing unit 43 calculates, for each of the plurality of spots SP of the pattern light 15, the distance measurement value by the SL method and stores it in the internal memory as the second distance measurement value D2.
  • Specifically, the signal processing unit 43 adds up the detection signal of the first tap and the detection signal of the second tap for each pixel 71 of the pixel array unit 63 supplied from the distance measuring sensor 42 to generate a pixel signal.
  • From the pixel signals, the signal processing unit 43 distinguishes the pixel signals d_B of the pixels 71 corresponding to the bright parts from the pixel signals I_B of the pixels 71 corresponding to the dark parts, and identifies the position of each spot SP in the light receiving region (pixel array unit 63).
  • The pixels 71 corresponding to a bright part can be detected using the luminance information (pixel signals) of a single pixel or of a plurality of pixels, such as a 3x3 block.
  • the signal processing unit 43 acquires the position of the spot SP on the light source device 11 side corresponding to the position of each specified spot SP from the internal memory.
  • the position of the spot SP on the light source device 11 side corresponding to the position of each spot SP is associated with, for example, the position of each spot SP and is stored in advance in the internal memory.
  • Then, from the position of each received spot SP, the corresponding position on the light source device 11 side, and the baseline distance BL between the light source principal point of the light source device 11 and the sensor principal point (light receiving center) of the distance measuring sensor 42, the signal processing unit 43 calculates the distance measurement value D2 for each of the plurality of spots SP of the pattern light 15 according to the principle of triangulation. The baseline distance BL is also stored in the internal memory in advance.
  • In step S6, the signal processing unit 43 executes the fusion distance measurement value calculation process for each of the plurality of spots SP of the received pattern light 15.
  • the fusion distance measurement value calculation process is a process of calculating the fusion distance measurement value DF by fusing the first distance measurement value D1 by the ToF method and the second distance measurement value D2 by the SL method.
  • As the fusion distance measurement value calculation process, for example, any of the first to third fusion distance measurement value calculation processes shown in FIGS. 10, 11, and 13 is executed.
  • FIG. 10 is a flowchart of the first fusion distance measurement value calculation process that can be executed as the fusion distance measurement value calculation process in step S6 of FIG. 9.
  • First, the signal processing unit 43 calculates the value of a distance evaluation index (distance evaluation index value) based on the first distance measurement value D1 obtained by the ToF method and the second distance measurement value D2 obtained by the SL method.
  • the distance evaluation index value is a value for determining whether the distance to the object OBJ as the object to be measured is a short distance or a long distance.
  • As the distance evaluation index value, for example, the average of the first distance measurement value D1 and the second distance measurement value D2 can be used.
  • Alternatively, the distance measurement value of one predetermined detection method, that is, the first distance measurement value D1 or the second distance measurement value D2, may simply be adopted as the distance evaluation index value.
  • In step S22, the signal processing unit 43 determines whether the distance evaluation index value indicates a short distance. For example, the signal processing unit 43 determines a short distance when the distance evaluation index value is equal to or less than a predetermined value.
  • If it is determined in step S22 that the distance evaluation index value indicates a short distance, the process proceeds to step S23, and the signal processing unit 43 determines the second distance measurement value D2 obtained by the SL method as the fusion distance measurement value DF.
  • On the other hand, if it is determined in step S22 that the distance evaluation index value does not indicate a short distance, that is, indicates a long distance, the process proceeds to step S24, and the signal processing unit 43 determines the first distance measurement value D1 obtained by the ToF method as the fusion distance measurement value DF.
  • As described above, in the first fusion distance measurement value calculation process, the fusion distance measurement value DF is calculated based on the value of the distance evaluation index obtained using the first distance measurement value D1 of the ToF method and the second distance measurement value D2 of the SL method.
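  • A sketch of this first selection rule (the threshold value and function names are illustrative assumptions, not specified by the patent):

```python
def fuse_by_distance_index(d1_tof_mm, d2_sl_mm, near_threshold_mm=1000.0):
    """First fusion process: SL at short range, ToF at long range.

    The distance evaluation index here is the average of D1 and D2;
    the text also allows using D1 or D2 alone as the index.
    """
    index = 0.5 * (d1_tof_mm + d2_sl_mm)
    return d2_sl_mm if index <= near_threshold_mm else d1_tof_mm

print(fuse_by_distance_index(480.0, 450.0))    # short -> 450.0 (SL)
print(fuse_by_distance_index(3200.0, 3900.0))  # long  -> 3200.0 (ToF)
```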
  • FIG. 11 is a flowchart of the second fusion distance measurement value calculation process that can be executed as the fusion distance measurement value calculation process in step S6 of FIG. 9.
  • In step S31, the signal processing unit 43 determines the detection method with the smaller error based on the error evaluation indexes of the ToF method and the SL method.
  • the error evaluation indexes of the ToF method and the SL method as shown in FIG. 12 are calculated in advance and stored in the internal memory.
  • The error evaluation indexes E1 and E2 are error evaluation functions that take the distance (distance measurement value) as a parameter; the error evaluation index E1 represents the error evaluation index of the ToF method, and the error evaluation index E2 represents that of the SL method.
  • According to the error evaluation indexes E1 and E2 of the ToF method and the SL method shown in FIG. 12, the error of the second distance measurement value D2 by the SL method is smaller at short distances, and when the distance measurement value is larger than a predetermined value, the error of the first distance measurement value D1 by the ToF method is smaller.
  • the signal processing unit 43 calculates an error based on the error evaluation indexes E1 and E2 of the ToF method and the SL method, respectively, and determines the detection method having the smaller error.
  • In step S32, the signal processing unit 43 determines the distance measurement value of the detection method with the smaller error (the first distance measurement value D1 or the second distance measurement value D2) as the fusion distance measurement value DF.
  • As described above, in the second fusion distance measurement value calculation process, the fusion distance measurement value DF is calculated based on the error evaluation indexes of the ToF method and the SL method.
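  • A sketch of this second selection rule; the error evaluation functions below are invented placeholders standing in for the pre-computed curves of FIG. 12:

```python
def e1_tof(d_mm):
    # Placeholder: ToF error grows slowly with distance.
    return 2.0 + 0.004 * d_mm

def e2_sl(d_mm):
    # Placeholder: SL error grows quadratically with distance.
    return 1.0 + 4e-6 * d_mm * d_mm

def fuse_by_error_index(d1_tof_mm, d2_sl_mm):
    """Second fusion process: evaluate each method's error at its own
    measured distance and keep the measurement with the smaller error."""
    return d1_tof_mm if e1_tof(d1_tof_mm) <= e2_sl(d2_sl_mm) else d2_sl_mm

print(fuse_by_error_index(500.0, 480.0))     # short -> 480.0 (SL)
print(fuse_by_error_index(4000.0, 4100.0))   # long  -> 4000.0 (ToF)
```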
  • FIG. 13 is a flowchart of the third fusion distance measurement value calculation process that can be executed as the fusion distance measurement value calculation process in step S6 of FIG. 9.
  • In step S41, the signal processing unit 43 determines whether the detection signal of the ToF method (detection data after AD conversion) indicates saturation.
  • For example, the signal processing unit 43 determines that the detection signal of the ToF method indicates saturation when the detection signal (detection data after AD conversion) of the spot SP portion is equal to or higher than a predetermined threshold value corresponding to the saturation level.
  • If it is determined in step S41 that the detection signal of the ToF method indicates saturation, the process proceeds to step S42, and the signal processing unit 43 determines the second distance measurement value D2 obtained by the SL method as the fusion distance measurement value DF.
  • On the other hand, if it is determined in step S41 that the detection signal of the ToF method does not indicate saturation, the process proceeds to step S43, and the signal processing unit 43 determines the first distance measurement value D1 obtained by the ToF method as the fusion distance measurement value DF.
  • As described above, in the third fusion distance measurement value calculation process, the fusion distance measurement value DF is calculated based on whether or not the detection signal of the spot SP portion in the ToF method indicates saturation.
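  • A sketch of this third selection rule (the 12-bit saturation level and all names are illustrative assumptions):

```python
SATURATION_LEVEL = 4095  # e.g. full scale of a 12-bit AD conversion

def fuse_by_saturation(spot_tap_data, d1_tof_mm, d2_sl_mm):
    """Third fusion process: if any detection data of the spot SP portion
    reaches the saturation threshold, the ToF phase is unreliable, so
    fall back to the SL measurement; otherwise keep the ToF one."""
    saturated = max(spot_tap_data) >= SATURATION_LEVEL
    return d2_sl_mm if saturated else d1_tof_mm

print(fuse_by_saturation([4095, 3900, 810, 95], 300.0, 320.0))    # -> 320.0
print(fuse_by_saturation([2100, 2600, 400, 90], 2950.0, 3010.0))  # -> 2950.0
```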
  • In step S6 of FIG. 9, the fusion distance measurement value DF is calculated by any of the first to third fusion distance measurement value calculation processes and output from the distance measuring device 21 to the subsequent stage, and the distance measurement process of FIG. 9 ends.
  • Whether the fusion distance measurement value DF corresponds to the first distance measurement value D1 obtained by the ToF method or to the second distance measurement value D2 obtained by the SL method differs for each spot SP.
  • A spot SP marked "ToF" in its vicinity indicates that the first distance measurement value D1 obtained by the ToF method was selected as the fusion distance measurement value DF, and a spot SP marked "SL" indicates that the second distance measurement value D2 obtained by the SL method was selected as the fusion distance measurement value DF.
  • As described above, by driving the distance measuring sensor 42, which has two charge storage units (two taps) per pixel, the distance measuring system 1 can calculate two distance measurement values from the detection signals of the first tap and the second tap: the first distance measurement value D1 by the ToF method and the second distance measurement value D2 by the SL method. The distance measuring system 1 can then output a fusion distance measurement value DF in which these two distance measurement values are fused.
  • Therefore, even when spot light having a high emission intensity is used as the irradiation light emitted by the light source device 11 and the pixels saturate for an object at a short distance, the distance can still be measured. That is, a wide measurement range can be realized by using spot light.
  • When spot light is used, the distance measurement value can be calculated only at the positions of the spots SP, which are the bright parts of the pattern light 15, within the light receiving region of the distance measuring sensor 42. Although this has the merits of improving accuracy and extending the measurement range, the resolution is lower than when the irradiation light is plane light.
  • Therefore, the distance measuring system 1 can execute high resolution processing that uses the fusion distance measurement values DF of the plurality of spots SP to calculate fusion distance measurement values DF at positions in the dark part between the plurality of spots SP, thereby improving the resolution of the distance measurement values.
  • As the high resolution processing, the signal processing unit 43 calculates the fusion distance measurement value DF of a dark part between the plurality of bright parts by the bilateral upsampling method.
  • First, the signal processing unit 43 detects the positions of a plurality of spots SP, which are bright parts, around the position P of the dark part to be calculated. In the example of FIG. 15, the positions of the three spots SP_A, SP_B, and SP_C are detected.
  • The similarity w_PQ(Δ) can be a function such that the smaller the difference Δ between the detection signals, the greater the similarity; for example, a Gaussian function can be used as the similarity function w(Δ).
  • The signal processing unit 43 sets positions A', B', and C' in the dark part around the three spots SP_A, SP_B, and SP_C, respectively, and computes the similarity w_PQ(Δ) between the detection signal at the position P of the dark part to be calculated and the detection signal at each surrounding dark part position. Using these similarities as weights, the fusion distance measurement value DF_P at the position P is calculated by the weighted average of the following equation (5): DF_P = Σ_Q w_PQ(I_P − I_Q') × DF_Q / Σ_Q w_PQ(I_P − I_Q') (Q = A, B, C) ... (5)
  • In other words, the signal processing unit 43 calculates the fusion distance measurement value DF_P at the position P by the bilateral upsampling method, using as a reference the detection signals I_Q' (detection signals I_A', I_B', and I_C') of the dark part positions A', B', and C' around each of the plurality of bright parts (spots SP_A, SP_B, and SP_C) surrounding the position P of the dark part to be calculated.
  • the resolution of the distance measurement value can be improved by using the detection signal of the same frame as a reference.
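  • A sketch of this weighted average (equation (5)) with a Gaussian similarity function; all names, the sigma value, and the sample signals are illustrative assumptions:

```python
import math

def similarity(delta, sigma=10.0):
    # w(delta): the closer the detection signals, the larger the weight
    # (a Gaussian, as suggested in the text).
    return math.exp(-(delta * delta) / (2.0 * sigma * sigma))

def upsample_dark_position(i_p, neighbors):
    """Fusion distance at a dark-part position P per equation (5).

    i_p:       detection signal at P.
    neighbors: (I_Q', DF_Q) pairs; I_Q' is the detection signal at the
               dark-part position beside spot Q, and DF_Q is that
               spot's fusion distance measurement value.
    """
    weights = [similarity(i_p - i_q) for i_q, _ in neighbors]
    return (sum(w * df for w, (_, df) in zip(weights, neighbors))
            / sum(weights))

# Spots SP_A, SP_B, SP_C with fusion distances 1000, 1040, 1100 mm.
dist = upsample_dark_position(
    52.0, [(50.0, 1000.0), (55.0, 1040.0), (70.0, 1100.0)])
print(f"{dist:.0f} mm")  # ~1027 mm
```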
  • FIG. 16 is a flowchart of the distance measurement process including the above-mentioned high resolution process.
  • Since steps S61 to S66 of FIG. 16 are the same as steps S1 to S6 shown in FIG. 9, their description is omitted.
  • In step S67, for the position P of the dark part to be calculated, set at a position between the plurality of spots SP constituting the pattern light 15, the signal processing unit 43 calculates the fusion distance measurement value DF_P at the position P by the bilateral upsampling method, using as a reference the detection signals I of the dark part positions around the surrounding bright parts.
  • a plurality of positions P of the dark portion to be calculated can be set within the region of the pattern light 15.
  • <Modification examples of the distance measuring system> FIGS. 17 and 18 show other configuration examples of the distance measuring system to which the present technology is applied.
  • The distance measuring system 1 of FIG. 17 differs from the distance measuring system 1 of FIG. 1 in that the light source device 11 of FIG. 1 is replaced with a light source device 111; the other points are configured in the same manner as in FIG. 1.
  • The light source device 111 can switch between irradiating the pattern light 15, which has two types of brightness, a bright part and a dark part, and irradiating plane light 121, in which the emission brightness of an entire substantially rectangular region is uniform within a predetermined brightness range.
  • the light source device 111 irradiates the predetermined object OBJ with the pattern light 15 at the first timing. Further, the light source device 111 irradiates the predetermined object OBJ with the plane light 121 at the second timing.
  • The first timing and the second timing may be arbitrary different timings, or may be timings separated by a fixed, predetermined interval. That is, the irradiation of the pattern light 15 and the irradiation of the plane light 121 may be performed independently or continuously with a fixed temporal relationship.
  • When the pattern light 15 is irradiated from the light source device 111, the distance measuring device 21 receives the reflected light from the object OBJ and calculates the fusion distance measurement value DF for each of the plurality of spots SP. When the plane light 121 is irradiated from the light source device 111, the distance measuring device 21 receives the reflected light from the object OBJ and calculates a distance measurement value D3 by the ToF method for each pixel 71 of the pixel array unit 63.
  • the distance measuring device 21 can output the calculated fusion distance measuring value DF or the distance measuring value D3 as a measurement result.
  • When the distance measuring device 21 executes the distance measurement process including the high resolution processing, it can calculate the fusion distance measurement value DF_P at the position P by the bilateral upsampling method using the distance measurement value D3 measured with the plane light 121 as a reference, and output the fusion distance measurement value after the high resolution processing.
  • Instead of a configuration in which the light source device 111 switches between the pattern light 15 and the plane light 121, a light source device for irradiating the plane light 121 may be provided separately, in addition to the light source device 11 for irradiating the pattern light 15.
  • The distance measuring system 1 of FIG. 18 differs from the distance measuring system 1 of FIG. 1 in that it further includes an RGB sensor 141 capable of capturing visible light including the RGB wavelengths and a light receiving side optical system 142 that collects the visible light incident on the RGB sensor 141; the other points are configured in the same manner as in FIG. 1.
  • The RGB sensor 141 has the same imaging range as the light receiving range of the distance measuring device 21, captures the same subject as the distance measuring device 21, and generates a captured image.
  • In this case, the fusion distance measurement value DF_P at the position P can be calculated by the bilateral upsampling method with reference to the image captured by the RGB sensor 141 at the same time as the measurement by the distance measuring device 21, and the fusion distance measurement value after the high resolution processing can be output.
  • Alternatively, a configuration having an IR sensor in place of the RGB sensor 141 and the light receiving side optical system 142 is also possible; in that case, the fusion distance measurement value DF_P at the position P is calculated by the bilateral upsampling method using, as a reference, the captured image obtained by the IR sensor at the same time as the measurement by the distance measuring device 21, and the fusion distance measurement value after the high resolution processing can be output.
  • As described above, the fusion distance measurement value after the high resolution processing can be output using, as a reference, the distance measurement value D3 measured with the plane light 121 or the image captured by the RGB sensor 141.
  • the distance measuring system 1 described above can be mounted on electronic devices such as smartphones, tablet terminals, mobile phones, personal computers, game machines, television receivers, wearable terminals, digital still cameras, and digital video cameras.
  • FIG. 19 is a block diagram showing a configuration example of a smartphone as an electronic device equipped with the distance measuring system 1.
  • By executing a program with the CPU, the control unit 210 functions as an application processing unit 221 and an operation system processing unit 222.
  • the distance measuring system 1 of FIG. 1 is applied to the distance measuring module 202.
  • The distance measuring module 202 is arranged on the front of the smartphone 201 and, by performing distance measurement for the user of the smartphone 201, can output the depth values of the surface shape of the user's face, hand, fingers, and the like as the distance measurement result.
  • The image pickup device 203 is arranged on the front of the smartphone 201 and acquires a captured image of the user by imaging the user of the smartphone 201 as a subject. Although not shown, the image pickup device 203 may also be arranged on the back of the smartphone 201.
  • the display 204 displays an operation screen for performing processing by the application processing unit 221 and the operation system processing unit 222, an image captured by the image pickup device 203, and the like.
  • The speaker 205 and the microphone 206, for example, output the voice of the other party and pick up the user's voice during a call on the smartphone 201.
  • the communication module 207 communicates via the communication network.
  • the sensor unit 208 senses speed, acceleration, proximity, etc., and the touch panel 209 acquires a touch operation by the user on the operation screen displayed on the display 204.
  • the application processing unit 221 performs processing for providing various services by the smartphone 201.
  • For example, based on the depth supplied from the distance measuring module 202, the application processing unit 221 can create a computer graphics face that virtually reproduces the user's facial expression and perform a process of displaying it on the display 204.
  • the application processing unit 221 can perform a process of creating, for example, three-dimensional shape data of an arbitrary three-dimensional object based on the depth supplied from the distance measuring module 202.
  • the operation system processing unit 222 performs processing for realizing the basic functions and operations of the smartphone 201.
  • the operation system processing unit 222 can perform a process of authenticating the user's face and unlocking the smartphone 201 based on the depth value supplied from the distance measuring module 202.
  • The operation system processing unit 222 can also, for example, perform a process of recognizing the user's gestures based on the depth values supplied from the distance measuring module 202 and input various operations according to the gestures.
  • In the smartphone 201 configured in this way, applying the distance measuring system 1 described above makes it possible, for example, to generate a depth map with high accuracy and at high speed. As a result, the smartphone 201 can detect distance measurement information more accurately.
  • the technology according to the present disclosure can be applied to various products.
  • For example, the technology according to the present disclosure may be realized as a device mounted on any kind of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 20 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • The vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • In the example shown in FIG. 20, the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, an in-vehicle information detection unit 12040, and an integrated control unit 12050.
  • As the functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network I/F (interface) 12053 are shown.
  • The drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • For example, the drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • The body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
  • For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, blinkers, or fog lamps.
  • In this case, radio waves transmitted from a portable device that substitutes for a key, or signals of various switches, can be input to the body system control unit 12020.
  • The body system control unit 12020 receives the input of these radio waves or signals and controls the door lock device, the power window device, the lamps, and the like of the vehicle.
  • The vehicle exterior information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000.
  • For example, an imaging unit 12031 is connected to the vehicle exterior information detection unit 12030.
  • The vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the outside of the vehicle and receives the captured image.
  • Based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for persons, vehicles, obstacles, signs, characters on the road surface, and the like.
  • The imaging unit 12031 is an optical sensor that receives light and outputs an electric signal according to the amount of light received.
  • The imaging unit 12031 can output the electric signal as an image or as distance measurement information. The light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
  • The in-vehicle information detection unit 12040 detects information about the inside of the vehicle.
  • For example, a driver state detection unit 12041 that detects the driver's state is connected to the in-vehicle information detection unit 12040.
  • The driver state detection unit 12041 includes, for example, a camera that images the driver, and the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or determine whether the driver is dozing, based on the detection information input from the driver state detection unit 12041.
  • The microcomputer 12051 can calculate control target values for the driving force generating device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, and can output control commands to the drive system control unit 12010.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including vehicle collision avoidance or impact mitigation, follow-up driving based on the inter-vehicle distance, constant-speed driving, vehicle collision warning, and vehicle lane departure warning.
  • The microcomputer 12051 can also perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040.
  • The microcomputer 12051 can also output control commands to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of preventing glare, such as controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching the high beam to the low beam.
  • The audio image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the passengers of the vehicle or the outside of the vehicle of information.
  • In the example of FIG. 20, an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
  • The display unit 12062 may include, for example, at least one of an onboard display and a head-up display.
  • FIG. 21 is a diagram showing an example of the installation positions of the imaging unit 12031.
  • In FIG. 21, the vehicle 12100 has imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
  • The imaging units 12101, 12102, 12103, 12104, and 12105 are provided at positions such as the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 12100.
  • The imaging unit 12101 provided on the front nose and the imaging unit 12105 provided on the upper part of the windshield in the vehicle interior mainly acquire images of the area in front of the vehicle 12100.
  • The imaging units 12102 and 12103 provided on the side mirrors mainly acquire images of the areas to the sides of the vehicle 12100.
  • The imaging unit 12104 provided on the rear bumper or the back door mainly acquires images of the area behind the vehicle 12100.
  • The front images acquired by the imaging units 12101 and 12105 are mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
  • FIG. 21 also shows an example of the imaging ranges of the imaging units 12101 to 12104.
  • The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 as viewed from above can be obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • For example, at least one of the imaging units 12101 to 12104 may be a stereo camera composed of a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the nearest three-dimensional object that is on the traveling path of the vehicle 12100 and travels at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100.
  • Furthermore, the microcomputer 12051 can set in advance an inter-vehicle distance to be secured from the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data on three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see. The microcomputer 12051 then determines a collision risk indicating the degree of risk of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of a collision, it can provide driving support for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
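The collision risk determination above can be illustrated with a simple time-to-collision rule; the formula and the threshold below are assumptions for illustration, not the criterion specified in this disclosure.

```python
def collision_possible(distance_m, relative_speed_mps, ttc_alarm_s=2.0):
    """Hypothetical sketch: flag a possible collision when the obstacle is
    approaching (its distance is shrinking) and the estimated time to
    collision falls below ttc_alarm_s seconds."""
    closing_speed = -relative_speed_mps  # negative relative speed = approaching
    if closing_speed <= 0:
        return False  # obstacle is holding distance or receding
    return distance_m / closing_speed < ttc_alarm_s
```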
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • For example, the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the images captured by the imaging units 12101 to 12104.
  • Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the contour of an object to determine whether or not the object is a pedestrian.
  • When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so as to superimpose a rectangular contour line for emphasis on the recognized pedestrian. The audio image output unit 12052 may also control the display unit 12062 so as to display an icon or the like indicating a pedestrian at a desired position.
  • The above is an example of a vehicle control system to which the technology according to the present disclosure can be applied.
  • The technology according to the present disclosure can be applied to the vehicle exterior information detection unit 12030 and the in-vehicle information detection unit 12040 among the configurations described above.
  • For example, by applying the distance measurement by the distance measuring system 1 to the in-vehicle information detection unit 12040, processing for recognizing the driver's gestures can be performed, various operations (for example, on the audio system, the navigation system, or the air conditioning system) can be input according to the gestures, and the driver's condition can be detected more accurately.
  • The distance measurement by the distance measuring system 1 can also be used to recognize the unevenness of the road surface and reflect it in the control of the suspension.
  • In this specification, the configuration described as one device (or processing unit) may be divided and configured as a plurality of devices (or processing units).
  • Conversely, the configurations described above as a plurality of devices (or processing units) may be combined and configured as one device (or processing unit).
  • A configuration other than those described above may be added to the configuration of each device (or each processing unit).
  • A part of the configuration of one device (or processing unit) may be included in the configuration of another device (or another processing unit).
  • In this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
  • The present technology can have the following configurations.
  • (1) A distance measuring device including: a distance measuring sensor that receives reflected light, which is pattern light having two types of brightness, a bright part and a dark part, emitted from a light source device and reflected back by an object; and a signal processing unit that detects, as a phase difference, the time from when the pattern light is emitted until it is received as the reflected light, calculates a first distance measurement value, which is the distance to the object, based on the phase difference, detects the position of the bright part of the pattern light, calculates a second distance measurement value, which is the distance to the object, by the principle of triangulation using the detected position of the bright part, and calculates a fusion distance measurement value obtained by fusing the first distance measurement value and the second distance measurement value.
  • (2) The distance measuring device according to (1), wherein the signal processing unit detects the phase difference based on the difference between the detection signal of a pixel corresponding to the bright part of the pattern light and the detection signal of a pixel corresponding to the dark part of the pattern light.
  • (3) The distance measuring device according to (1) or (2), wherein the signal processing unit detects the position of the bright part of the pattern light, acquires the position in the light source device corresponding to the detected bright part, and calculates the second distance measurement value by the principle of triangulation.
  • (4) The distance measuring device according to any one of (1) to (3), wherein the signal processing unit calculates the fusion distance measurement value based on the value of a distance evaluation index that uses the first distance measurement value and the second distance measurement value.
  • (5) The distance measuring device according to any one of (1) to (3), wherein the signal processing unit calculates the fusion distance measurement value based on error indexes of a first method for calculating the first distance measurement value and a second method for calculating the second distance measurement value.
  • (6) The distance measuring device according to any one of (1) to (3), wherein the signal processing unit calculates the fusion distance measurement value based on whether or not the detection signal of the pixel at the time of detecting the phase difference is saturated.
  • The pattern light is a dot pattern in which dots are arranged at regular or irregular predetermined intervals.
  • The signal processing unit further executes a resolution enhancement process for calculating a fusion distance measurement value at the position of the dark part.
  • The signal processing unit calculates the fusion distance measurement value at the position of a dark part between a plurality of bright parts by using a bilateral upsampling method (a sketch follows this list).
  • The distance measuring device according to any one of (1) to (11), wherein the signal processing unit calculates the fusion distance measurement value at the position of the dark part with reference to a detection signal that the distance measuring sensor receives for flat light emitted by the light source device before or after the pattern light.
  • (14) The distance measuring device according to any one of (1) to (11), wherein the signal processing unit calculates the fusion distance measurement value at the position of the dark part with reference to a detection signal received by an RGB sensor.
  • A measuring method in which a distance measuring device detects, as a phase difference, the time from when pattern light having two types of brightness, a bright part and a dark part, is emitted from a light source device until it is reflected by an object and received as reflected light, calculates a first distance measurement value, which is the distance to the object, based on the phase difference, detects the position of the bright part of the pattern light, calculates a second distance measurement value, which is the distance to the object, by the principle of triangulation using the detected position of the bright part, and calculates a fusion distance measurement value obtained by fusing the first distance measurement value and the second distance measurement value.
  • (16) A distance measuring system including: a light source device that emits pattern light having two types of brightness, a bright part and a dark part; and a distance measuring device that receives reflected light, which is the pattern light reflected back by an object, wherein the distance measuring device includes a distance measuring sensor that receives the reflected light, and a signal processing unit that detects, as a phase difference, the time from when the pattern light is emitted until it is received as the reflected light, calculates a first distance measurement value, which is the distance to the object, based on the phase difference, detects the position of the bright part of the pattern light, calculates a second distance measurement value, which is the distance to the object, by the principle of triangulation using the detected position of the bright part, and calculates a fusion distance measurement value obtained by fusing the first distance measurement value and the second distance measurement value.
  • (17) The distance measuring system according to (16), wherein the light source device switches between emitting the pattern light and flat light whose emission brightness is uniform over the entire area within a predetermined brightness range.
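As a rough sketch of the bilateral upsampling mentioned above, a dark-part pixel can take a joint-bilateral-weighted average of the distance values measured at surrounding bright parts, guided by an intensity image; the inputs and weights below are assumptions for illustration, not the exact procedure of this disclosure.

```python
import math

def bilateral_upsample(px, py, spots, guide, sigma_s=8.0, sigma_r=0.1):
    """Hypothetical sketch: estimate a fusion distance value at a dark-part
    pixel (px, py) from the distance values of nearby bright parts (spots).

    spots: list of (x, y, distance) tuples for surrounding spot centers
    guide: 2D intensity image (e.g. a flat-light or RGB capture) normalized
           to [0, 1], used for the range (similarity) term
    """
    num = den = 0.0
    g0 = guide[py][px]
    for sx, sy, d in spots:
        w_spatial = math.exp(-((sx - px) ** 2 + (sy - py) ** 2) / (2 * sigma_s ** 2))
        w_range = math.exp(-((guide[sy][sx] - g0) ** 2) / (2 * sigma_r ** 2))
        w = w_spatial * w_range
        num += w * d
        den += w
    return num / den if den > 0.0 else float("nan")
```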

Abstract

This technology pertains to a distance measurement device, a measurement method, and a distance measurement system that make it possible to implement a broad measurement range using spot light. This distance measurement device comprises: a distance measurement sensor that receives reflection light when pattern light has been reflected back by an object, the pattern light having two types of brightness, i.e., bright sections and dark sections, and being emitted from a light source device; and a signal processing unit that detects, as a phase difference, a period of time from when the pattern light was emitted to when this light is received as reflection light, calculates a first distance measurement value that is the distance to the object on the basis of the phase difference, detects the position of the bright sections in the pattern light, calculates a second distance measurement value that is the distance to the object according to the principle of triangulation using the detected positions of the bright sections, and calculates a combined distance measurement value in which the first distance measurement value and the second distance measurement value are combined. The present technology can be applied to, for example, a distance measurement system or the like that measures the distance to a subject.

Description

Distance measuring device, measuring method, and distance measuring system

The present technology relates to a distance measuring device, a measuring method, and a distance measuring system, and more particularly to a distance measuring device, a measuring method, and a distance measuring system capable of realizing a wide measurement range by using spot light.
In recent years, advances in semiconductor technology have promoted the miniaturization of distance measuring modules that measure the distance to an object. This has made it possible, for example, to mount a distance measuring module on a mobile terminal such as a so-called smartphone, a small information processing device having a communication function.

Distance measuring methods used in distance measuring modules include, for example, the Indirect ToF (Time of Flight) method and the Structured Light method. The Indirect ToF method irradiates an object with light, detects the light reflected from the surface of the object, and calculates the distance to the object based on a measured value of the flight time of that light (see, for example, Patent Document 1). The Structured Light method irradiates an object with pattern light and calculates the distance to the object based on an image capturing the distortion of the pattern on the surface of the object.

In the Indirect ToF method, the irradiation light directed at the object can be, for example, flat light or spot light. Flat light can illuminate a wide area, but because the emission intensity is dispersed, an object at a long distance cannot be measured (the measurement range is short). Spot light, by contrast, can concentrate the optical power density, so it can measure farther, and because the influence of multipath can also be suppressed, improved accuracy can be expected.
Patent Document 1: JP-A-2017-150893
However, when the irradiation light is spot light, the amount of received light becomes large for an object at a short distance, so the pixels may saturate and measurement may not be possible.

The present technology has been made in view of such a situation, and makes it possible to realize a wide measurement range by using spot light.
The distance measuring device according to the first aspect of the present technology includes: a distance measuring sensor that receives reflected light, which is pattern light having two types of brightness, a bright part and a dark part, emitted from a light source device and reflected back by an object; and a signal processing unit that detects, as a phase difference, the time from when the pattern light is emitted until it is received as the reflected light, calculates a first distance measurement value, which is the distance to the object, based on the phase difference, detects the position of the bright part of the pattern light, calculates a second distance measurement value, which is the distance to the object, by the principle of triangulation using the detected position of the bright part, and calculates a fusion distance measurement value obtained by fusing the first distance measurement value and the second distance measurement value.
In the measuring method according to the second aspect of the present technology, a distance measuring device detects, as a phase difference, the time from when pattern light having two types of brightness, a bright part and a dark part, is emitted from a light source device until it is reflected by an object and received as reflected light, calculates a first distance measurement value, which is the distance to the object, based on the phase difference, detects the position of the bright part of the pattern light, calculates a second distance measurement value, which is the distance to the object, by the principle of triangulation using the detected position of the bright part, and calculates a fusion distance measurement value obtained by fusing the first distance measurement value and the second distance measurement value.
The distance measuring system according to the third aspect of the present technology includes: a light source device that emits pattern light having two types of brightness, a bright part and a dark part; and a distance measuring device that receives reflected light, which is the pattern light reflected back by an object. The distance measuring device includes: a distance measuring sensor that receives the reflected light; and a signal processing unit that detects, as a phase difference, the time from when the pattern light is emitted until it is received as the reflected light, calculates a first distance measurement value, which is the distance to the object, based on the phase difference, detects the position of the bright part of the pattern light, calculates a second distance measurement value, which is the distance to the object, by the principle of triangulation using the detected position of the bright part, and calculates a fusion distance measurement value obtained by fusing the first distance measurement value and the second distance measurement value.
In the first to third aspects of the present technology, pattern light having two types of brightness, a bright part and a dark part, emitted from a light source device and reflected back by an object is received as reflected light; the time from when the pattern light is emitted until it is received as the reflected light is detected as a phase difference; a first distance measurement value, which is the distance to the object, is calculated based on the phase difference; the position of the bright part of the pattern light is detected; a second distance measurement value, which is the distance to the object, is calculated by the principle of triangulation using the detected position of the bright part; and a fusion distance measurement value obtained by fusing the first distance measurement value and the second distance measurement value is calculated.

The distance measuring device and the distance measuring system may each be an independent device, or may be a module incorporated in another device.
FIG. 1 is a diagram showing a schematic configuration example of a distance measuring system to which the present technology is applied. FIG. 2 is a block diagram showing a configuration example of a light source device and a distance measuring device. FIG. 3 is a diagram explaining the distance measuring method of the ToF method. FIG. 4 is a diagram explaining the distance measuring method of the SL method. FIG. 5 is a diagram explaining other examples of the pattern light. FIG. 6 is a block diagram showing a configuration example of a distance measuring sensor. FIG. 7 is a block diagram showing a configuration example of a pixel. FIG. 8 is a perspective view showing a chip configuration example of a distance measuring device. FIG. 9 is a flowchart explaining distance measurement processing. FIG. 10 is a flowchart of a first fusion distance measurement value calculation process. FIG. 11 is a flowchart of a second fusion distance measurement value calculation process. FIG. 12 is a diagram explaining an error evaluation index. FIG. 13 is a flowchart of a third fusion distance measurement value calculation process. FIG. 14 is a diagram explaining results obtained by the fusion distance measurement value calculation processes. FIG. 15 is a diagram explaining the resolution enhancement processing executed by the signal processing unit. FIG. 16 is a flowchart explaining distance measurement processing including the resolution enhancement processing. FIG. 17 is a diagram showing another configuration example of a distance measuring system to which the present technology is applied. FIG. 18 is a diagram showing another configuration example of a distance measuring system to which the present technology is applied. FIG. 19 is a block diagram showing a configuration example of an electronic device to which the present technology is applied. FIG. 20 is a block diagram showing an example of a schematic configuration of a vehicle control system. FIG. 21 is an explanatory diagram showing an example of the installation positions of a vehicle exterior information detection unit and imaging units.
Hereinafter, embodiments for carrying out the present technology (hereinafter referred to as embodiments) will be described with reference to the accompanying drawings. In this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and duplicate description is omitted. The description will be given in the following order.

1. Schematic configuration example of the distance measuring system
2. Explanation of the distance measuring methods of the ToF method and the SL method
3. Configuration example of the distance measuring sensor
4. Chip configuration example of the distance measuring device
5. Distance measurement processing
6. Resolution enhancement processing
7. Distance measurement processing
8. Modifications of the distance measuring system
9. Configuration example of an electronic device
10. Application examples to moving bodies
<1. Schematic configuration example of the distance measuring system>

FIG. 1 shows a schematic configuration example of a distance measuring system to which the present technology is applied.
The distance measuring system 1 shown in FIG. 1 includes a light source device 11, a light emitting side optical system 12, a distance measuring device 21, and a light receiving side optical system 22.

The light source device 11 generates and emits pattern light 15 having two types of brightness, a bright part and a dark part. The pattern light 15 is, for example, a pattern in which a plurality of dot-shaped (circular) spots SP arranged at regular or irregular predetermined intervals form the bright part, as shown in FIG. 1, and the remaining region forms the dark part. As will be described later, the pattern light 15 emitted by the light source device 11 is not limited to a pattern whose bright parts are dots, and may be a grid pattern or the like. The pattern light 15 emitted from the light source device 11 irradiates a predetermined object OBJ, the object to be measured, via the light emitting side optical system 12. The pattern light 15 is then reflected by the object OBJ and enters the distance measuring device 21 via the light receiving side optical system 22.

The distance measuring device 21 receives the pattern light 15 reflected by the object OBJ and generates a detection signal according to the amount of the received pattern light 15. Based on the detection signal, the distance measuring device 21 then calculates and outputs a distance measurement value, which is a measured value of the distance to the object OBJ.
FIG. 2 is a block diagram showing a configuration example of the light source device 11 and the distance measuring device 21.

The light source device 11 has a light emitting source 31 and a light source driving unit 32. The distance measuring device 21 has a synchronization control unit 41, a distance measuring sensor 42, and a signal processing unit 43.

The light emitting source 31 is composed of, for example, a light source array in which a plurality of light emitting elements, such as VCSELs (Vertical Cavity Surface Emitting Lasers), are arranged in the plane direction. Under the control of the light source driving unit 32, the light emitting source 31 emits light while modulating it at a timing corresponding to the light emission timing signal supplied from the synchronization control unit 41 of the distance measuring device 21, and irradiates the object OBJ with the pattern light 15 as irradiation light. For the irradiation light, for example, infrared light with a wavelength in the range of about 850 nm to 940 nm is used.

The light source driving unit 32 is composed of, for example, a laser driver or the like, and causes each light emitting element of the light emitting source 31 to emit light in response to the light emission timing signal supplied from the synchronization control unit 41.

The synchronization control unit 41 of the distance measuring device 21 generates the light emission timing signal that controls the timing at which each light emitting element of the light emitting source 31 emits light, and supplies it to the light source driving unit 32. The synchronization control unit 41 also supplies the light emission timing signal to the distance measuring sensor 42 in order to drive the distance measuring sensor 42 in synchronization with the light emission timing of the light emitting source 31. As the light emission timing signal, for example, a rectangular wave signal (pulse signal) that turns on and off at a predetermined frequency (for example, 20 MHz or 50 MHz) can be used. The light emission timing signal is not limited to a rectangular wave as long as it is a periodic signal, and may be, for example, a sine wave.
The distance measuring sensor 42 receives, with a pixel array unit 63 (FIG. 6) in which a plurality of pixels 71 (FIG. 6) are two-dimensionally arranged in a matrix, the reflected light produced when the pattern light 15 emitted from the light source device 11 is reflected by the object OBJ. The distance measuring sensor 42 then supplies a detection signal according to the amount of received reflected light to the signal processing unit 43 for each pixel of the pixel array unit 63.

The signal processing unit 43 calculates, based on the detection signals supplied from the distance measuring sensor 42, a distance measurement value that is the distance from the distance measuring sensor 42 to the object OBJ. More specifically, the signal processing unit 43 calculates a distance measurement value by the ToF method, which is the first detection method (the first distance measurement value), and a distance measurement value by the SL method, which is the second detection method (the second distance measurement value). The signal processing unit 43 then calculates a fusion distance measurement value that fuses the first distance measurement value and the second distance measurement value, and outputs it to the outside. The ToF method detects, as a phase difference, the time from when a spot SP, a bright part of the pattern light 15, is emitted until it is received as reflected light, and calculates the distance based on the phase difference; the SL method detects the position of a spot SP, a bright part of the pattern light 15, and calculates the distance by the principle of triangulation using the detected spot light position.

The distance measuring system 1 configured as described above irradiates the object OBJ with pattern light 15 consisting of two types of brightness, light and dark, as in the pattern light 15 of FIG. 1, calculates the first distance measurement value by the ToF method, which is the first detection method, and the second distance measurement value by the SL method, which is the second detection method, calculates the fusion distance measurement value from the first distance measurement value and the second distance measurement value, and outputs it to the outside.
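To make this overall flow concrete, the following is a minimal sketch of the fusion step, using the saturation-based selection that appears among the configurations listed earlier as one possible rule; the function and its inputs are assumptions for illustration. The calculation of the two input values themselves is sketched after the ToF and SL method descriptions below.

```python
def fusion_distance(d_tof, d_sl, tof_pixel_saturated):
    """Minimal sketch of one possible fusion rule (an assumption): prefer
    the SL (triangulation) value when the ToF pixel saturated, which is
    typical for a near, bright spot; otherwise prefer the ToF value."""
    return d_sl if tof_pixel_saturated else d_tof
```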
In the distance measuring system 1, the synchronization control unit 13 that generates the light emission timing signal may be provided on the light source device 11 side instead of on the distance measuring device 21 side.
<2. Explanation of the distance measuring methods of the ToF method and the SL method>

Next, the distance measuring methods of the ToF method and the SL method will be briefly described with reference to FIGS. 3 and 4.

First, the distance measuring method of the ToF method will be described with reference to FIG. 3.
The distance measurement value D [mm], which corresponds to the distance from the distance measuring device 21 to the object OBJ, can be calculated by the following equation (1):

$$D = \frac{1}{2} \cdot c \cdot \Delta t \qquad (1)$$
In equation (1), Δt is the time until the pattern light 15 emitted from the light emitting source 31 is reflected by the object OBJ and enters the distance measuring sensor 42, and c is the speed of light.
For the pattern light 15 emitted from the light emitting source 31, pulsed light that repeatedly turns on and off at a predetermined frequency f (the modulation frequency), as shown in FIG. 3, is adopted. One period T of the pulsed light is 1/f. The distance measuring sensor 42 detects the reflected light (the light receiving pattern) with a phase shift corresponding to the time Δt the light takes to travel from the light emitting source 31 to the distance measuring sensor 42. Letting φ be the amount of phase shift (the phase difference) between the light emitting pattern and the light receiving pattern, the time Δt can be calculated by the following equation (2):

$$\Delta t = \frac{\varphi}{2\pi f} \qquad (2)$$
Therefore, from equations (1) and (2), the distance measurement value D from the distance measuring sensor 42 to the object OBJ can be calculated by the following equation (3):

$$D = \frac{c \cdot \varphi}{4\pi f} \qquad (3)$$
Next, the method of calculating the phase difference φ will be described.

Each pixel of the pixel array formed in the distance measuring sensor 42 repeatedly turns ON and OFF at high speed, and accumulates charge only during the ON period.

The distance measuring sensor 42 sequentially switches the ON/OFF execution timing of each pixel of the pixel array, accumulates the charge at each execution timing, and outputs a detection signal according to the accumulated charge.

There are four types of ON/OFF execution timing: phase 0 degrees, phase 90 degrees, phase 180 degrees, and phase 270 degrees.
The phase 0 degree execution timing sets the ON timing (light receiving timing) of each pixel of the pixel array to the same phase as the pulsed light emitted by the light emitting source 31 of the light source device 11, that is, the same phase as the light emitting pattern.

The phase 90 degree execution timing sets the ON timing (light receiving timing) of each pixel of the pixel array to a phase delayed by 90 degrees from the pulsed light (light emitting pattern) emitted by the light emitting source 31 of the light source device 11.

The phase 180 degree execution timing sets the ON timing (light receiving timing) of each pixel of the pixel array to a phase delayed by 180 degrees from the pulsed light (light emitting pattern) emitted by the light emitting source 31 of the light source device 11.

The phase 270 degree execution timing sets the ON timing (light receiving timing) of each pixel of the pixel array to a phase delayed by 270 degrees from the pulsed light (light emitting pattern) emitted by the light emitting source 31 of the light source device 11.

The distance measuring sensor 42, for example, sequentially switches the light receiving timing in the order of phase 0 degrees, phase 90 degrees, phase 180 degrees, and phase 270 degrees, and acquires the amount of received reflected light (accumulated charge) at each light receiving timing. In FIG. 3, the timing at which the reflected light is incident within each phase's light receiving timing (ON timing) is shaded.
As shown in FIG. 3, letting Q0, Q90, Q180, and Q270 be the charges accumulated when the light receiving timing is set to phase 0 degrees, phase 90 degrees, phase 180 degrees, and phase 270 degrees, respectively, the phase difference φ can be calculated using Q0, Q90, Q180, and Q270 by the following equation (4):

$$\varphi = \arctan\!\left(\frac{Q_{90} - Q_{270}}{Q_{0} - Q_{180}}\right) \qquad (4)$$

By substituting the phase difference φ calculated by equation (4) into equation (3) above, the distance measurement value D from the distance measuring system 1 to the object OBJ can be calculated.
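As a numeric check of equations (1), (2), and (4), the following short sketch computes a distance from four assumed accumulated charges at a 20 MHz modulation frequency; the charge values are made up for illustration.

```python
import math

f_mod = 20e6                                # modulation frequency f [Hz]
q0, q90, q180, q270 = 400, 700, 200, 100    # assumed accumulated charges

phi = math.atan2(q90 - q270, q0 - q180) % (2 * math.pi)  # equation (4)
dt = phi / (2 * math.pi * f_mod)                         # equation (2)
d = 0.5 * 299_792_458.0 * dt                             # equation (1)

print(f"phi = {phi:.3f} rad, D = {d:.2f} m")
# phi = arctan(600 / 200) ≈ 1.249 rad, so D ≈ 1.49 m
```

Using atan2 here resolves the quadrant ambiguity that the plain arctangent of equation (4) leaves to the signs of the numerator and the denominator.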
In each pixel of the pixel array, the distance measuring sensor 42 switches the light receiving timing in the order of phase 0 degrees, phase 90 degrees, phase 180 degrees, and phase 270 degrees as described above, and sequentially supplies detection signals corresponding to the accumulated charge in each phase (charges Q0, Q90, Q180, and Q270) to the signal processing unit 43. As will be described later, by providing two charge accumulation units in each pixel of the pixel array and accumulating charge in the two charge accumulation units alternately, the detection signals of two light receiving timings with inverted phases, for example phase 0 degrees and phase 180 degrees, can be acquired in one frame. Two frames of detection signals therefore suffice to acquire the detection signals of the four phases 0, 90, 180, and 270 degrees.
Next, the distance measuring method of the SL method will be described with reference to FIG. 4.

The SL method uses a light source device that projects pattern light and a light receiving sensor (image sensor) that receives the pattern light, and measures distance by applying triangulation after finding the pair consisting of a position in the projected pattern and the corresponding position on the light receiving sensor.

As shown in FIG. 1, the light source device 11 irradiates the object OBJ with a plurality of spots SP arranged at predetermined intervals. Focusing on one spot SP (hereinafter referred to as the spot of interest SP), assume that the spot of interest SP is detected at a predetermined position P2 in the light receiving region of the distance measuring sensor 42.

At this time, the position P1 in the projected pattern of the light source device 11 from which the spot of interest SP was emitted is known to the light source device 11. The positional relationship between the light source device 11 and the distance measuring sensor 42, including the baseline distance BL between the light source principal point (projection center) of the light source device 11 and the sensor principal point (light receiving center) of the distance measuring sensor 42, is also known. Therefore, using the position P1, the position P2, and the baseline distance BL, the distance measurement value D corresponding to the distance from the distance measuring device 21 to the object OBJ can be calculated by the principle of triangulation.
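Under the common simplifying assumption of a rectified pinhole model with a horizontal baseline, the triangulation of P1, P2, and BL reduces to a disparity computation; the sketch below is an illustration under that assumption, not the exact formulation of this disclosure.

```python
def triangulate_spot(p1_x, p2_x, baseline_mm, focal_px):
    """Minimal triangulation sketch (rectified pinhole model assumed).

    p1_x: spot position P1 in the projected pattern, expressed in sensor
          pixel coordinates
    p2_x: detected spot position P2 on the distance measuring sensor [px]
    baseline_mm: baseline distance BL between projector and sensor
    focal_px: focal length of the receiving optics, in pixels
    """
    disparity = p1_x - p2_x  # the shift grows as the object gets closer
    if disparity <= 0:
        raise ValueError("spot at or beyond the calibrated infinity position")
    return focal_px * baseline_mm / disparity  # distance value D [mm]
```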
Therefore, when the distance measuring sensor 42 receives a plurality of spots SP of the pattern light 15, the distance measurement value D from the distance measuring device 21 to the object OBJ can be calculated for each of the plurality of spots SP, provided it is known which of the spots SP emitted by the light source device 11 corresponds to each received spot SP.

For this reason, the pattern light 15 emitted by the light source device 11 of the distance measuring system 1 is a pattern in which a plurality of spots SP are arranged so that, when the distance measuring sensor 42 receives a given spot SP at a given position in the light receiving region, the spot SP (position) of the light emitting source 31 that emitted it can be identified.

Specifically, the position of a spot SP received as reflected light by the distance measuring sensor 42 moves along a predetermined trajectory in the light receiving region according to the distance to the object OBJ. As long as the trajectory of each spot SP does not overlap the trajectories of the other spots SP, the spot SP (position) of the light emitting source 31 that emitted it can be identified based on the position of the spot SP received by the distance measuring sensor 42. In other words, the pattern light 15 is a dot pattern in which a plurality of spots SP are arranged at sufficiently sparse intervals so that the position of a spot SP detected by the distance measuring sensor 42 does not overlap with the other spots SP.
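This identification can be sketched as a lookup against precalibrated, non-overlapping trajectories; the data layout below (one horizontal segment per spot) is an assumption for illustration.

```python
def identify_spot(x, y, spot_tracks, tolerance_px=2.0):
    """Hypothetical sketch: match a detected spot centroid (x, y) to the
    unique emitter whose calibrated trajectory contains it. Each trajectory
    is modeled here as a horizontal segment (row, x_min, x_max) traced by
    that spot as the object distance varies.

    spot_tracks: dict mapping emitter spot id -> (row, x_min, x_max)
    """
    for spot_id, (row, x_min, x_max) in spot_tracks.items():
        if abs(y - row) <= tolerance_px and x_min <= x <= x_max:
            return spot_id
    return None  # no calibrated trajectory contains this detection
```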
Note that, for the pattern light 15, it suffices that, for each of the plurality of bright parts, the bright part detected in the light receiving region and the bright part of the light emitting source 31 that emitted it can be identified, so the bright parts (dots) do not need to be regularly arranged as in FIG. 1.

For example, the pattern light 15 may be a pattern in which dots (spots SP) as bright parts are arranged irregularly, as shown in A of FIG. 5. In this case, for example, the features of the irregular dot arrangement can be used to detect the correspondence between the light receiving position and the light emitting position. It is also possible to use the features of the irregular dot arrangement to calculate a distance measurement value by the Structured Light method.

Alternatively, the bright parts of the pattern light 15 do not need to be dots. For example, as shown in B of FIG. 5, the pattern light 15 may be a grid pattern in which lattice patterns, each having arbitrary cells of a 3x3 square lattice as bright parts, are arranged. With such a grid pattern, which lattice pattern was received can be identified from the arrangement of the bright parts of the 3x3 square lattice, so the correspondence between the position of a lattice pattern in the light receiving region of the distance measuring sensor 42 and the position of the light emitting source 31 that emitted that lattice pattern can be detected.

When each pixel of the distance measuring sensor 42 has two charge accumulation units, the ToF method requires two frames of detection signals as described above, whereas the SL method can calculate distance from one frame of detection signals, so the distance measurement value D can be calculated using the detection signals of either one of the two frames used in the ToF method.
<3. Configuration example of the distance measuring sensor>

FIG. 6 is a block diagram showing a configuration example of the distance measuring sensor 42.

The distance measuring sensor 42 includes a timing control unit 61, a row scanning circuit 62, a pixel array unit 63, a plurality of AD (Analog to Digital) conversion units 64, a column scanning circuit 65, and a signal processing unit 66. In the pixel array unit 63, a plurality of pixels 71 are two-dimensionally arranged in a matrix in the row direction and the column direction. Here, the row direction means the direction in which the pixels 71 are arranged horizontally, and the column direction means the direction in which the pixels 71 are arranged vertically. The row direction is the horizontal direction in the figure, and the column direction is the vertical direction in the figure.
The timing control unit 61 is composed of, for example, a timing generator that generates various timing signals, and generates various timing signals in synchronization with the light emission timing signal supplied from the synchronization control unit 41 (FIG. 2), supplying them to the row scanning circuit 62, the AD conversion units 64, and the column scanning circuit 65. That is, the timing control unit 61 controls the drive timing of the row scanning circuit 62, the AD conversion units 64, and the column scanning circuit 65.

The row scanning circuit 62 is composed of, for example, a shift register, an address decoder, or the like, and drives the pixels 71 of the pixel array unit 63 all at once, row by row, or the like. Each pixel 71 receives the reflected light under the control of the row scanning circuit 62 and outputs a detection signal (pixel signal) at a level corresponding to the amount of light received. Details of the pixel 71 will be described later with reference to FIG. 7.

For the matrix-like pixel arrangement of the pixel array unit 63, a pixel drive line 72 is wired along the horizontal direction for each pixel row, and a vertical signal line 73 is wired along the vertical direction for each pixel column. The pixel drive line 72 transmits a drive signal for reading a detection signal from the pixels 71. In FIG. 6, the pixel drive line 72 is shown as a single wire, but it is actually composed of a plurality of wires. Similarly, the vertical signal line 73 is also shown as a single wire but is actually composed of a plurality of wires.

The AD conversion units 64 are provided for each column and, in synchronization with the clock signal CK supplied from the timing control unit 61, AD-convert the detection signals supplied from the pixels 71 of the corresponding column via the vertical signal line 73. The AD conversion units 64 output the AD-converted detection signals (detection data) to the signal processing unit 66 under the control of the column scanning circuit 65. The column scanning circuit 65 selects the AD conversion units 64 in order and causes them to output the AD-converted detection data to the signal processing unit 66.

The signal processing unit 66 has at least an arithmetic processing function and performs various kinds of signal processing, such as arithmetic processing, based on the detection data output from the AD conversion units 64.
<Pixel configuration example>
 FIG. 7 is a block diagram showing a configuration example of the pixel 71.
 The pixel 71 includes a photoelectric conversion element 81, a transfer switch 82, charge storage units 83 and 84, and selection switches 85 and 86.
 The photoelectric conversion element 81 is composed of, for example, a photodiode, and photoelectrically converts the reflected light to generate electric charge.
 The transfer switch 82 transfers the charge generated by the photoelectric conversion element 81 to either the charge storage unit 83 or the charge storage unit 84 based on the transfer signal SEL_FD. The transfer switch 82 is composed of, for example, a pair of MOS (Metal-Oxide-Semiconductor) transistors.
 The charge storage units 83 and 84 are composed of, for example, floating diffusion layers; they accumulate charge and generate voltages corresponding to the accumulated charge. The charges accumulated in the charge storage units 83 and 84 can be reset based on the reset signal RST.
 The selection switch 85 selects the output of the charge storage unit 83 according to the selection signal RD_FD1, and the selection switch 86 selects the output of the charge storage unit 84 according to the selection signal RD_FD2. That is, when the selection switch 85 or 86 is turned on by the selection signal RD_FD1 or RD_FD2, a voltage signal corresponding to the charge accumulated in the selected charge storage unit 83 or 84 is output as a detection signal to the AD conversion unit 64 via the vertical signal line 73. Each of the selection switches 85 and 86 is composed of, for example, a MOS transistor.
 The wiring that transmits the transfer signal SEL_FD, the reset signal RST, and the selection signals RD_FD1 and RD_FD2 corresponds to the pixel drive line 72 in FIG. 6.
 If the charge storage units 83 and 84 are called the first tap and the second tap, respectively, then in the ToF method the pixel 71 alternately accumulates the charge generated by the photoelectric conversion element 81 in the first tap and the second tap. This makes it possible to acquire, in one frame, detection signals at two light receiving timings whose phases are inverted, for example phase 0 degrees and phase 180 degrees. In the next frame, detection signals at the two light receiving timings of phase 90 degrees and phase 270 degrees can be acquired.
 In the SL method, when detecting the position of a spot SP of the pattern light 15, the detection signals of the first tap and the second tap are summed into the pixel signal of one pixel. As with an image sensor, the position of each spot SP in the light receiving region can then be identified based on the pixel signal (luminance information) of each pixel. The summation of the detection signals of the first tap and the second tap can be performed by the signal processing unit 43 in the subsequent stage.
 Therefore, the distance measuring sensor 42 can operate with the same drive for both the ToF and SL detection methods.
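 Purely as an illustration of this shared drive (not part of the embodiment; the frame and tap data layout below is an assumption), the same pair of tap values can be consumed in both ways:

    def tof_phase_samples(frame1, frame2):
        # ToF use of the two taps: frame 1 yields the phase 0/180 pair and
        # frame 2 the phase 90/270 pair, each as (first tap, second tap).
        q0, q180 = frame1
        q90, q270 = frame2
        return q0, q90, q180, q270

    def sl_pixel_signal(tap1, tap2):
        # SL use of the two taps: their sum is the luminance pixel signal.
        return tap1 + tap2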
<4. Example of the chip configuration of the distance measuring device>
 FIG. 8 is a perspective view showing a chip configuration example of the distance measuring device 21.
 As shown in A of FIG. 8, the distance measuring device 21 can be composed of a single chip in which a first die (substrate) 91 and a second die (substrate) 92 are stacked.
 In the first die 91, for example, the synchronization control unit 41 and the distance measuring sensor 42 are formed, and in the second die 92, for example, the signal processing unit 43 is formed.
 Note that the distance measuring device 21 may also be composed of three layers, with another logic die stacked in addition to the first die 91 and the second die 92, or of a stack of four or more dies (substrates).
 Alternatively, as shown in B of FIG. 8, the distance measuring device 21 can be configured by forming a first chip 95 serving as the distance measuring sensor 42 and a second chip 96 serving as the signal processing unit 43 on a relay board 97. The synchronization control unit 41 is included in either the first chip 95 or the second chip 96.
<5. Distance measurement processing>
 Next, with reference to the flowchart of FIG. 9, the distance measurement processing that the distance measuring system 1 performs using the two detection methods, ToF and SL, will be described. This processing is started, for example, when an instruction to start distance measurement is supplied from the control unit of a higher-level device in which the distance measuring system 1 is incorporated.
 First, in step S1, the synchronization control unit 41 generates a light emission timing signal and supplies it to the light source driving unit 32 and the distance measuring sensor 42.
 In step S2, the light emitting source 31 emits the pattern light 15 under the control of the light source driving unit 32, modulating it according to the light emission timing signal. The light emitting source 31 thereby irradiates the predetermined object OBJ with the irradiation light in synchronization with the light emission timing signal. The pattern light 15 is, for example, infrared light with a dot pattern, as shown in FIG. 1, in which a plurality of spots SP arranged at predetermined intervals form the bright portions and the other regions form the dark portions.
 In step S3, the distance measuring sensor 42 receives the reflected light produced when the pattern light 15 emitted from the light emitting source 31 is reflected by the predetermined object OBJ. For each pixel 71 of the pixel array unit 63, the distance measuring sensor 42 then supplies the detection signals of the first tap and the second tap, corresponding to the amount of reflected light received, to the signal processing unit 43. As described above, the ToF method requires detection signals of two frames, so detection signals of at least two frames are acquired and supplied to the signal processing unit 43.
 In step S4, the signal processing unit 43 uses the detection signals of the first tap and the second tap of each pixel 71 of the pixel array unit 63 supplied from the distance measuring sensor 42 to calculate, for each of the plurality of spots SP of the pattern light 15, the ranging value D1 by the ToF method, and stores it in internal memory as the first ranging value D1.
 The method of calculating the first ranging value D1 by the ToF method in step S4 will now be described.
 Strictly speaking, the detection signal of each tap (the first tap or the second tap) corresponding to one spot SP, a bright portion of the pattern light 15, includes not only the light of the spot SP directly reflected by the predetermined object OBJ but also light indirectly reflected at places other than the predetermined object OBJ (for example, a wall or another object). Letting dA be the detection signal due to the directly reflected component of the spot SP light and IA be the detection signal due to the indirectly reflected component, the detection signal of each tap of a pixel 71 on which the light of one spot SP is incident is expressed as (dA + IA). In the dark-portion pixels 71, other than the pixels 71 on which the spot SP light is incident, a detection signal IA corresponding only to the indirectly reflected component is detected.
 The signal processing unit 43 estimates the position of each spot SP in the light receiving region (pixel array unit 63) of the distance measuring sensor 42 from the detection signals (dA + IA) of the pixels 71 corresponding to the bright portions and the detection signals IA of the pixels 71 corresponding to the dark portions.
 Then, the signal processing unit 43 calculates the difference between the detection signal (dA + IA) of a pixel 71 corresponding to a bright portion and the detection signal IA of a pixel 71 corresponding to a dark portion, obtaining the bright-portion detection signal with the indirectly reflected component removed: dA = (dA + IA) - IA.
 For each spot SP estimated in the light receiving region, the signal processing unit 43 uses the detection signals dA of the first tap and the second tap, from which the indirectly reflected component has been removed, to calculate the phase difference φ according to equation (4) and thereby calculate the first ranging value D1.
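 A minimal sketch of this D1 calculation, assuming the four phase samples of a spot are stored as dicts keyed by phase and that equation (4) is the usual indirect-ToF relation φ = atan2(q90 - q270, q0 - q180) with D = c·φ/(4πf); the function name, data layout, and modulation frequency f_mod are hypothetical:

    import math

    C = 299_792_458.0  # speed of light [m/s]

    def first_ranging_value(bright, dark, f_mod):
        # dA = (dA + IA) - IA: remove the indirect component using the
        # signal of a neighboring dark-portion pixel, per phase sample.
        d = {k: bright[k] - dark[k] for k in (0, 90, 180, 270)}
        # Phase difference per equation (4), wrapped to [0, 2*pi).
        phi = math.atan2(d[90] - d[270], d[0] - d[180]) % (2 * math.pi)
        return C * phi / (4 * math.pi * f_mod)  # first ranging value D1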
 In step S5, the signal processing unit 43 uses the detection signals of the first tap and the second tap of each pixel 71 of the pixel array unit 63 supplied from the distance measuring sensor 42 to calculate, for each of the plurality of spots SP of the pattern light 15, the ranging value D2 by the SL method, and stores it in internal memory as the second ranging value D2.
 The method of calculating the second ranging value D2 by the SL method in step S5 will now be described.
 First, for each pixel 71 of the pixel array unit 63 supplied from the distance measuring sensor 42, the signal processing unit 43 sums the detection signal of the first tap and the detection signal of the second tap to generate a pixel signal.
 In the light receiving region (pixel array unit 63), the signal processing unit 43 identifies the pixel signals dB of the pixels 71 corresponding to the bright portions and the pixel signals IB of the pixels 71 corresponding to the dark portions, and identifies the position of each spot SP. The pixels 71 corresponding to bright portions can be detected using luminance information (pixel signals) in units of single pixels or of multiple pixels such as 3x3.
 Next, the signal processing unit 43 acquires from internal memory the position of the spot SP on the light source device 11 side that corresponds to the position of each identified spot SP. The light-source-side position corresponding to each spot SP position is, for example, stored in internal memory in advance in association with that spot SP position.
 Then, from the position of each received spot SP, the corresponding position on the light source device 11 side, and the baseline distance BL to the sensor principal point (light receiving center) of the distance measuring sensor 42, the signal processing unit 43 calculates the ranging value D2 for each of the plurality of spots SP of the pattern light 15 according to the principle of triangulation. The baseline distance BL is also stored in internal memory in advance.
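 The triangulation formula itself is not spelled out here; under a standard pinhole model with a horizontal baseline it reduces to the usual disparity relation sketched below, where the focal length in pixels and the purely horizontal disparity are assumptions:

    def second_ranging_value(spot_x_px, source_x_px, baseline_bl_m, focal_px):
        # Triangulation: the distance is inversely proportional to the disparity
        # between the received spot position and the light-source-side position.
        disparity = spot_x_px - source_x_px  # [pixels]
        return baseline_bl_m * focal_px / disparity  # second ranging value D2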
 Next, in step S6, the signal processing unit 43 executes fused ranging value calculation processing for each of the plurality of spots SP of the received pattern light 15. The fused ranging value calculation processing calculates a fused ranging value DF by fusing the first ranging value D1 obtained by the ToF method and the second ranging value D2 obtained by the SL method; for example, one of the first to third fused ranging value calculation processes shown in FIGS. 10, 11, and 13 is executed.
<First fused ranging value calculation process>
 FIG. 10 is a flowchart of the first fused ranging value calculation process, which can be executed as the fused ranging value calculation processing in step S6 of FIG. 9.
 In this process, first, in step S21, the signal processing unit 43 calculates the value of a distance evaluation index (distance evaluation index value) based on the first ranging value D1 obtained by the ToF method and the second ranging value D2 obtained by the SL method. The distance evaluation index value is a value for determining whether the distance to the object OBJ being measured is short or long; for example, the average of the first ranging value D1 and the second ranging value D2 can be used. Alternatively, the ranging value of one predetermined detection method, that is, the first ranging value D1 or the second ranging value D2, may simply be adopted as the distance evaluation index value.
 In step S22, the signal processing unit 43 determines whether the distance evaluation index value indicates a short distance. For example, the signal processing unit 43 determines that a short distance is indicated when the distance evaluation index value is equal to or less than a predetermined value.
 If it is determined in step S22 that the distance evaluation index value indicates a short distance, the process proceeds to step S23, and the signal processing unit 43 determines the second ranging value D2 obtained by the SL method as the fused ranging value DF.
 On the other hand, if it is determined in step S22 that the distance evaluation index value does not indicate a short distance, that is, indicates a long distance, the process proceeds to step S24, and the signal processing unit 43 determines the first ranging value D1 obtained by the ToF method as the fused ranging value DF.
 As described above, in the first fused ranging value calculation process, the fused ranging value DF is calculated based on the value of a distance evaluation index that uses the first ranging value D1 obtained by the ToF method and the second ranging value D2 obtained by the SL method.
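 A sketch of this first fused ranging value calculation, taking the average of D1 and D2 as the distance evaluation index; the threshold separating short from long distances is a hypothetical parameter:

    def fuse_by_distance(d1_tof, d2_sl, near_threshold_m=1.0):
        # Distance evaluation index: average of the two ranging values.
        index = 0.5 * (d1_tof + d2_sl)
        # Short range -> SL value (step S23); otherwise -> ToF value (step S24).
        return d2_sl if index <= near_threshold_m else d1_tof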
<Second fused ranging value calculation process>
 FIG. 11 is a flowchart of the second fused ranging value calculation process, which can be executed as the fused ranging value calculation processing in step S6 of FIG. 9.
 In this process, first, in step S31, the signal processing unit 43 determines the detection method with the smaller error based on the error evaluation indexes of the ToF method and the SL method.
 For example, error evaluation indexes of the ToF method and the SL method, as shown in FIG. 12, are calculated in advance and stored in internal memory. The error evaluation indexes E1 and E2 are error evaluation functions with distance (ranging value) as a parameter; E1 represents the error evaluation index of the ToF method, and E2 represents that of the SL method. With the error evaluation indexes E1 and E2 of the ToF method and the SL method shown in FIG. 12, at short distances the second ranging value D2 by the SL method has the smaller error, and once the ranging value exceeds a predetermined value, the first ranging value D1 by the ToF method has the smaller error.
 The signal processing unit 43 calculates the errors based on the error evaluation indexes E1 and E2 of the ToF method and the SL method, respectively, and determines the detection method with the smaller error.
 Then, in step S32, the signal processing unit 43 determines the ranging value of the detection method with the smaller error (the first ranging value D1 or the second ranging value D2) as the fused ranging value DF.
 As described above, in the second fused ranging value calculation process, the fused ranging value DF is calculated based on the error evaluation indexes of the ToF method and the SL method.
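 A sketch of this second fused ranging value calculation; the concrete error curves below are made-up stand-ins for the pre-stored indexes E1 and E2, whose shapes FIG. 12 shows only qualitatively:

    def fuse_by_error(d1_tof, d2_sl, e1, e2):
        # Evaluate each method's error function at its own ranging value and
        # keep the value from the method with the smaller predicted error.
        return d1_tof if e1(d1_tof) <= e2(d2_sl) else d2_sl

    # Illustrative curves: SL error grows with distance and crosses the
    # roughly constant ToF error at some predetermined value.
    e1 = lambda d: 0.02            # E1: ToF method
    e2 = lambda d: 0.005 * d * d   # E2: SL method
    dF = fuse_by_error(3.2, 3.1, e1, e2)  # picks the ToF value here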
<Third fused ranging value calculation process>
 FIG. 13 is a flowchart of the third fused ranging value calculation process, which can be executed as the fused ranging value calculation processing in step S6 of FIG. 9.
 In this process, first, in step S41, the signal processing unit 43 determines whether the ToF detection signal (the detection data after AD conversion) indicates saturation. The signal processing unit 43 determines that the ToF detection signal indicates saturation when the detection signal of the spot SP portion (the detection data after AD conversion) is equal to or greater than a predetermined threshold corresponding to the saturation level.
 If it is determined in step S41 that the ToF detection signal indicates saturation, the process proceeds to step S42, and the signal processing unit 43 determines the second ranging value D2 obtained by the SL method as the fused ranging value DF.
 On the other hand, if it is determined in step S41 that the ToF detection signal does not indicate saturation, the process proceeds to step S43, and the signal processing unit 43 determines the first ranging value D1 obtained by the ToF method as the fused ranging value DF.
 As described above, in the third fused ranging value calculation process, the fused ranging value DF is calculated based on whether the detection signal of the spot SP portion in the ToF method indicates saturation.
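 A sketch of this third fused ranging value calculation; the saturation threshold corresponding to the AD full-scale level is a hypothetical parameter:

    def fuse_by_saturation(d1_tof, d2_sl, spot_detection_data, sat_threshold):
        # If the AD-converted data of the spot SP portion reaches the
        # saturation level, the ToF value is unreliable, so use the SL value.
        if spot_detection_data >= sat_threshold:
            return d2_sl  # step S42
        return d1_tof     # step S43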
 In step S6 of FIG. 9, the fused ranging value DF is calculated by one of the first to third fused ranging value calculation processes and is output from the distance measuring device 21 to the subsequent stage, and the distance measurement processing of FIG. 9 ends.
 Since the fused ranging value calculation processing is executed for each of the plurality of spots SP of the pattern light 15, whether the fused ranging value DF corresponds to the first ranging value D1 obtained by the ToF method or to the second ranging value D2 obtained by the SL method differs from spot SP to spot SP, as shown in FIG. 14. In FIG. 14, a spot SP marked "ToF" indicates that the first ranging value D1 obtained by the ToF method was selected as its fused ranging value DF, and a spot SP marked "SL" indicates that the second ranging value D2 obtained by the SL method was selected.
 According to the distance measurement processing by the distance measuring system 1 described above, two ranging values, the first ranging value D1 by the ToF method and the second ranging value D2 by the SL method, can be calculated from the detection signals of the first tap and the second tap obtained by driving the distance measuring sensor 42, which has two charge storage units (two taps) per pixel. The distance measuring system 1 can then output a fused ranging value DF that fuses these two ranging values.
 By fusing and outputting the ranging values of the two detection methods, the distance can be measured even when spot light with high emission intensity is used as the irradiation light emitted by the light source device 11 and pixels saturate for nearby objects. That is, a wide measurement range can be realized using spot light.
<6. High resolution processing>
 In the distance measurement processing described above, the positions at which a ranging value can be calculated are limited, within the light receiving region of the distance measuring sensor 42, to the positions of the spots SP that form the bright portions of the pattern light 15. Although this has the merits of improved ranging accuracy due to the high emission intensity and an extended measurement range, the resolution is lower than when the irradiation light is flat light.
 Therefore, the distance measuring system 1 can improve the resolution of the ranging values by executing high resolution processing that uses the fused ranging values DF of the plurality of spots SP to calculate fused ranging values DF at dark-portion positions between the spots SP.
 The high resolution processing executed by the signal processing unit 43 will be described with reference to FIG. 15.
 As the high resolution processing, the signal processing unit 43 uses a bilateral upsampling technique to calculate the fused ranging value DF of a dark portion lying between a plurality of bright portions.
 As shown in FIG. 15, an example will be described of calculating the fused ranging value DFP at a dark-portion position P lying between three spots SPA, SPB, and SPC among the plurality of spots SP forming the pattern light 15 received by the distance measuring sensor 42.
 The signal processing unit 43 detects the positions of the bright-portion spots SP around the calculation-target dark-portion position P. In the example of FIG. 15, the positions of the three spots SPA, SPB, and SPC are detected.
 Next, the signal processing unit 43 acquires the detection signals IA′, IB′, and IC′ at the dark-portion positions A′, B′, and C′ around the three spots SPA, SPB, and SPC, respectively. Then, taking each of the dark-portion positions A′, B′, and C′ in turn as a position Q (Q = A′, B′, or C′), the signal processing unit 43 calculates the similarity wPQ(η) between the detection signal IP at the calculation-target dark-portion position P and the detection signal IQ′ at the position Q. Here, η represents the difference between the detection signal IP at the calculation-target dark-portion position P and the detection signal IQ′ at the surrounding dark-portion position Q (η = IP - IQ′). The similarity wPQ(η) can be a function whose value increases as the difference η between the detection signals decreases; as the similarity function w(η), for example, a Gaussian function can be used.
 次に、信号処理部43は、3つのスポットSPA、SPB、および、SPCそれぞれの周囲の暗部の位置A’、B’、および、C’と、算出対象の暗部の位置Pとの類似度wPQ(η)を、周囲の暗部の位置A’、B’、および、C’それぞれの重みとして、次式(5)の重み付け平均により、算出対象の暗部の位置Pの融合測距値DFPを算出する。
Figure JPOXMLDOC01-appb-M000005
Next, the signal processing unit 43 sets the positions A', B', and C'of the dark areas around the three spots SP A , SP B , and SP C, respectively, and the position P of the dark area to be calculated. With the similarity w PQ (η) as the weights of the surrounding dark areas A', B', and C', the fusion distance measurement of the dark area position P to be calculated is performed by the weighted average of the following equation (5). to calculate the value DF P.
Figure JPOXMLDOC01-appb-M000005
 As described above, the signal processing unit 43 calculates the fused ranging value DFP at the position P by the bilateral upsampling technique, using as a reference the detection signals IQ′ (the detection signals IA′, IB′, and IC′) of the dark portions (positions A′, B′, and C′) around the plurality of bright portions (spots SPA, SPB, and SPC) surrounding the calculation-target dark-portion position P. By using detection signals of the same frame as the reference, the resolution of the ranging values can be improved.
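 A minimal sketch of this bilateral upsampling step, assuming a Gaussian similarity w(η) = exp(-η²/(2σ²)); the neighbor data layout and the value of σ are assumptions:

    import math

    def upsample_dark_position(i_p, neighbors, sigma=10.0):
        # neighbors: list of (i_q, df_spot) pairs, where i_q is the detection
        # signal at the dark-portion position Q next to a spot and df_spot is
        # that spot's fused ranging value DF.
        num = den = 0.0
        for i_q, df_spot in neighbors:
            eta = i_p - i_q                                     # signal difference
            w = math.exp(-(eta * eta) / (2.0 * sigma * sigma))  # similarity wPQ
            num += w * df_spot
            den += w
        return num / den  # weighted average of equation (5): DFP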
<7. Distance measurement processing>
 FIG. 16 is a flowchart of the distance measurement processing including the high resolution processing described above.
 Since the processes of steps S61 to S66 in FIG. 16 are the same as the processes of steps S1 to S6 shown in FIG. 9, their description is omitted.
 After the processing of steps S61 to S66, the high resolution processing of step S67 is executed. That is, in step S67, for each calculation-target dark-portion position P lying between the plurality of spots SP forming the pattern light 15, the signal processing unit 43 calculates the fused ranging value DFP at the position P by the bilateral upsampling technique, using as a reference the detection signals I of the dark portions around the bright portions surrounding the position P. A plurality of calculation-target dark-portion positions P can be set within the region of the pattern light 15.
 The fused ranging values DF of the plurality of spots SP forming the pattern light 15 and the fused ranging values DFP of the plurality of positions P added by the high resolution processing are output from the distance measuring device 21 to the subsequent stage as the fused ranging values after resolution enhancement, and the distance measurement processing of FIG. 16 ends.
<8. Modification examples of the distance measuring system>
 FIGS. 17 and 18 show other configuration examples of the distance measuring system to which the present technology is applied.
 The distance measuring system 1 of FIG. 17 differs from the distance measuring system 1 of FIG. 1 in that the light source device 11 of FIG. 1 is replaced with a light source device 111; in other respects, it is configured in the same manner as in FIG. 1.
 In addition to the pattern light 15, which has the two brightness levels of bright portions and dark portions, the light source device 111 can also, by switching, emit flat light 121 whose emission luminance is uniform within a predetermined luminance range over the entire substantially rectangular area.
 The light source device 111 irradiates the predetermined object OBJ with the pattern light 15 at a first timing and with the flat light 121 at a second timing. The first timing and the second timing may be arbitrary different timings, or timings with a preset predetermined interval. That is, the irradiation with the pattern light 15 and the irradiation with the flat light 121 may be performed individually and independently, or continuously with a fixed time relationship.
 When the pattern light 15 is emitted from the light source device 111, the distance measuring device 21 receives the reflected light produced when the pattern light 15 is reflected by the object OBJ, and calculates the fused ranging value DF for each of the plurality of spots SP. When the flat light 121 is emitted from the light source device 111, the distance measuring device 21 receives the reflected light produced when the flat light 121 is reflected by the object OBJ, and calculates a ranging value D3 by the ToF method for each pixel 71 of the pixel array unit 63.
 The distance measuring device 21 can output the calculated fused ranging value DF or ranging value D3 as the measurement result.
 Further, when executing the distance measurement processing including the high resolution processing, the distance measuring device 21 can calculate the fused ranging value DFP at the position P by the bilateral upsampling technique, using the ranging value D3 measured with the flat light 121 as the reference, and output the fused ranging values after the high resolution processing.
 Note that instead of a configuration in which the light source device 111 switches between the pattern light 15 and the flat light 121, a light source device that emits the flat light 121 may be additionally provided separately from the light source device 11 that emits the pattern light 15.
 The distance measuring system 1 of FIG. 18 differs from the distance measuring system 1 of FIG. 1 in that it further includes an RGB sensor 141 capable of capturing visible light including the RGB wavelengths and a light-receiving-side optical system 142 that condenses the visible light incident on it; in other respects, it is configured in the same manner as in FIG. 1.
 The RGB sensor 141 has the same imaging range as the light receiving range of the distance measuring device 21, images the same subject as the distance measuring device 21, and generates a captured image.
 When executing the distance measurement processing including the high resolution processing, the distance measuring device 21 can calculate the fused ranging value DFP at the position P by the bilateral upsampling technique, using as the reference a captured image obtained by the RGB sensor 141 at the same time as the distance measuring device 21, and output the fused ranging values after the high resolution processing.
 An IR sensor capable of capturing infrared light may also be used instead of the RGB sensor 141. In that case, the fused ranging value DFP at the position P can be calculated by the bilateral upsampling technique, using as the reference a captured image obtained by the IR sensor at the same time as the distance measuring device 21, and the fused ranging values after the high resolution processing can be output.
 Combinations of the configurations of the distance measuring systems 1 of FIGS. 17 and 18 are also possible, for example, a configuration having the light source device 111, the light-emitting-side optical system 12, the distance measuring device 21, and the light-receiving-side optical system 22 together with the RGB sensor 141 and the light-receiving-side optical system 142. In this case, the ranging value D3 measured with the flat light 121 can be used as the reference, or the captured image obtained by the RGB sensor 141 can be used as the reference, to output the fused ranging values after the high resolution processing.
<9. Configuration example of an electronic device>
 The distance measuring system 1 described above can be mounted on electronic devices such as smartphones, tablet terminals, mobile phones, personal computers, game machines, television receivers, wearable terminals, digital still cameras, and digital video cameras.
 FIG. 19 is a block diagram showing a configuration example of a smartphone as an electronic device equipped with the distance measuring system 1.
 As shown in FIG. 19, the smartphone 201 is configured with a distance measuring module 202, an imaging device 203, a display 204, a speaker 205, a microphone 206, a communication module 207, a sensor unit 208, a touch panel 209, and a control unit 210 connected via a bus 211. In the control unit 210, a CPU executes programs to provide the functions of an application processing unit 221 and an operation system processing unit 222.
 The distance measuring system 1 of FIG. 1 is applied to the distance measuring module 202. For example, the distance measuring module 202 is arranged on the front face of the smartphone 201 and, by performing distance measurement targeting the user of the smartphone 201, can output depth values of the surface shapes of the user's face, hands, fingers, and so on as the measurement result.
 The imaging device 203 is arranged on the front face of the smartphone 201 and acquires an image of the user by imaging the user of the smartphone 201 as the subject. Although not shown, an imaging device 203 may also be arranged on the back face of the smartphone 201.
 The display 204 displays an operation screen for processing by the application processing unit 221 and the operation system processing unit 222, images captured by the imaging device 203, and the like. The speaker 205 and the microphone 206, for example, output the other party's voice and pick up the user's voice when a call is made with the smartphone 201.
 The communication module 207 performs communication via a communication network. The sensor unit 208 senses speed, acceleration, proximity, and the like, and the touch panel 209 acquires the user's touch operations on the operation screen displayed on the display 204.
 The application processing unit 221 performs processing for providing various services through the smartphone 201. For example, based on the depth supplied from the distance measuring module 202, the application processing unit 221 can create a computer graphics face that virtually reproduces the user's facial expression and display it on the display 204. Based on the depth supplied from the distance measuring module 202, the application processing unit 221 can also perform processing for creating, for example, three-dimensional shape data of an arbitrary three-dimensional object.
 The operation system processing unit 222 performs processing for realizing the basic functions and operations of the smartphone 201. For example, based on the depth values supplied from the distance measuring module 202, the operation system processing unit 222 can authenticate the user's face and unlock the smartphone 201. Based on the depth values supplied from the distance measuring module 202, the operation system processing unit 222 can also, for example, recognize the user's gestures and input various operations according to those gestures.
 In the smartphone 201 configured in this way, applying the distance measuring system 1 described above makes it possible, for example, to generate a depth map with high accuracy and at high speed. The smartphone 201 can thereby detect ranging information more accurately.
<10. Application examples to mobile bodies>
 The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any kind of mobile body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
 FIG. 20 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile body control system to which the technology according to the present disclosure can be applied.
 The vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001. In the example shown in FIG. 20, the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050. As the functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio/image output unit 12052, and an in-vehicle network I/F (interface) 12053 are shown.
 The drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
 The body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs. For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, and fog lamps. In this case, radio waves transmitted from a portable device substituting for a key, or signals of various switches, can be input to the body system control unit 12020. The body system control unit 12020 receives the input of these radio waves or signals and controls the door lock device, power window device, lamps, and the like of the vehicle.
 The vehicle exterior information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000. For example, an imaging unit 12031 is connected to the vehicle exterior information detection unit 12030. The vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the outside of the vehicle and receives the captured image. Based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for people, vehicles, obstacles, signs, characters on the road surface, and the like.
 The imaging unit 12031 is an optical sensor that receives light and outputs an electric signal corresponding to the amount of light received. The imaging unit 12031 can output the electric signal as an image or as ranging information. The light received by the imaging unit 12031 may be visible light or invisible light such as infrared rays.
 The vehicle interior information detection unit 12040 detects information inside the vehicle. For example, a driver state detection unit 12041 that detects the driver's state is connected to the vehicle interior information detection unit 12040. The driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the vehicle interior information detection unit 12040 may calculate the driver's degree of fatigue or concentration, or determine whether the driver is dozing off.
 The microcomputer 12051 can calculate control target values for the driving force generating device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and output control commands to the drive system control unit 12010. For example, the microcomputer 12051 can perform cooperative control aimed at realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or impact mitigation, follow-up driving based on inter-vehicle distance, vehicle-speed-maintaining driving, vehicle collision warning, vehicle lane departure warning, and the like.
 Further, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, the microcomputer 12051 can perform cooperative control aimed at automated driving or the like in which the vehicle travels autonomously without depending on the driver's operation.
 The microcomputer 12051 can also output control commands to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030. For example, the microcomputer 12051 can perform cooperative control aimed at anti-glare, such as controlling the headlamps according to the position of a preceding or oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
 The audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the vehicle occupants or the outside of the vehicle of information. In the example of FIG. 20, an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices. The display unit 12062 may include, for example, at least one of an onboard display and a head-up display.
 図21は、撮像部12031の設置位置の例を示す図である。 FIG. 21 is a diagram showing an example of the installation position of the imaging unit 12031.
 図21では、車両12100は、撮像部12031として、撮像部12101,12102,12103,12104,12105を有する。 In FIG. 21, the vehicle 12100 has image pickup units 12101, 12102, 12103, 12104, 12105 as the image pickup unit 12031.
 撮像部12101,12102,12103,12104,12105は、例えば、車両12100のフロントノーズ、サイドミラー、リアバンパ、バックドア及び車室内のフロントガラスの上部等の位置に設けられる。フロントノーズに備えられる撮像部12101及び車室内のフロントガラスの上部に備えられる撮像部12105は、主として車両12100の前方の画像を取得する。サイドミラーに備えられる撮像部12102,12103は、主として車両12100の側方の画像を取得する。リアバンパ又はバックドアに備えられる撮像部12104は、主として車両12100の後方の画像を取得する。撮像部12101及び12105で取得される前方の画像は、主として先行車両又は、歩行者、障害物、信号機、交通標識又は車線等の検出に用いられる。 The imaging units 12101, 12102, 12103, 12104, 12105 are provided at positions such as the front nose, side mirrors, rear bumpers, back doors, and the upper part of the windshield in the vehicle interior of the vehicle 12100, for example. The imaging unit 12101 provided on the front nose and the imaging unit 12105 provided on the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100. The imaging units 12102 and 12103 provided in the side mirrors mainly acquire images of the side of the vehicle 12100. The imaging unit 12104 provided on the rear bumper or the back door mainly acquires an image of the rear of the vehicle 12100. The images in front acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
 なお、図21には、撮像部12101ないし12104の撮影範囲の一例が示されている。撮像範囲12111は、フロントノーズに設けられた撮像部12101の撮像範囲を示し、撮像範囲12112,12113は、それぞれサイドミラーに設けられた撮像部12102,12103の撮像範囲を示し、撮像範囲12114は、リアバンパ又はバックドアに設けられた撮像部12104の撮像範囲を示す。例えば、撮像部12101ないし12104で撮像された画像データが重ね合わせられることにより、車両12100を上方から見た俯瞰画像が得られる。 Note that FIG. 21 shows an example of the photographing range of the imaging units 12101 to 12104. The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging units 12102 and 12103. The imaging range of the imaging unit 12104 provided on the rear bumper or the back door is shown. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 as viewed from above can be obtained.
 At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information. For example, at least one of the imaging units 12101 to 12104 may be a stereo camera composed of a plurality of imaging elements, or may be an imaging element having pixels for phase-difference detection.
 For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the nearest three-dimensional object on the traveling path of the vehicle 12100 that travels at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100. Furthermore, the microcomputer 12051 can set an inter-vehicle distance to be secured ahead of the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, coordinated control can be performed for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
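 As a concrete illustration of the extraction just described (not part of the publication itself): a minimal Python sketch in which the Track structure, the thresholds, and the function name are hypothetical assumptions, with the relative speed derived from the temporal change of the measured distance.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Track:
    """A tracked three-dimensional object (hypothetical structure)."""
    distance_m: float       # distance at the current frame
    prev_distance_m: float  # distance at the previous frame
    on_path: bool           # True if the object lies on the ego vehicle's traveling path
    heading_deg: float      # heading relative to the ego vehicle, in degrees

def find_preceding_vehicle(tracks: List[Track], dt_s: float, ego_speed_kmh: float,
                           min_speed_kmh: float = 0.0,
                           max_heading_dev_deg: float = 10.0) -> Optional[Track]:
    """Pick the nearest on-path object moving in roughly the same direction."""
    candidates = []
    for t in tracks:
        if not t.on_path:
            continue
        # Closing speed (km/h): positive means the gap is shrinking.
        closing_kmh = (t.prev_distance_m - t.distance_m) / dt_s * 3.6
        object_speed_kmh = ego_speed_kmh - closing_kmh
        if abs(t.heading_deg) <= max_heading_dev_deg and object_speed_kmh >= min_speed_kmh:
            candidates.append(t)
    return min(candidates, key=lambda t: t.distance_m) if candidates else None
```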
 For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic obstacle avoidance. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that the driver of the vehicle 12100 can see and obstacles that are difficult to see. The microcomputer 12051 then determines a collision risk indicating the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can provide driving assistance for collision avoidance by outputting a warning to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
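 Purely as an illustration (the publication does not define the risk metric): one common choice for such a collision risk is the inverse time-to-collision compared against a set value, roughly as sketched below; the formula and the threshold are assumptions.

```python
def collision_risk(distance_m: float, closing_speed_ms: float) -> float:
    """Inverse time-to-collision (1/s); larger means more dangerous.

    closing_speed_ms > 0 means the obstacle is getting closer; a
    non-closing obstacle poses zero risk under this simple metric.
    """
    if closing_speed_ms <= 0.0 or distance_m <= 0.0:
        return 0.0
    return closing_speed_ms / distance_m

# Example: an obstacle 20 m ahead closing at 10 m/s -> TTC = 2 s, risk = 0.5 1/s.
RISK_THRESHOLD = 0.5  # assumed set value
if collision_risk(20.0, 10.0) >= RISK_THRESHOLD:
    print("warn driver; prepare forced deceleration or avoidance steering")
```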
 At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays. For example, the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching on the series of feature points representing the outline of an object to determine whether or not it is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio-image output unit 12052 controls the display unit 12062 so as to superimpose a rectangular contour line for emphasis on the recognized pedestrian. The audio-image output unit 12052 may also control the display unit 12062 so as to display an icon or the like indicating the pedestrian at a desired position.
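 A minimal sketch of the two-step recognition procedure above (outline extraction, then pattern matching against a pedestrian template) using OpenCV; the Otsu thresholding, the shape-matching method, and all thresholds are illustrative assumptions rather than the publication's algorithm.

```python
import cv2
import numpy as np

def detect_pedestrians(ir_image: np.ndarray, template_contour: np.ndarray,
                       match_threshold: float = 0.2):
    """Return bounding boxes of contours that match a pedestrian template.

    ir_image: single-channel infrared image (uint8).
    template_contour: reference contour of a pedestrian silhouette.
    """
    # Step 1: extract object outlines (a simple stand-in for feature-point extraction).
    _, binary = cv2.threshold(ir_image, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boxes = []
    for c in contours:
        if cv2.contourArea(c) < 100:  # ignore tiny blobs
            continue
        # Step 2: shape-based pattern matching (lower score = better match).
        score = cv2.matchShapes(c, template_contour, cv2.CONTOURS_MATCH_I1, 0.0)
        if score < match_threshold:
            boxes.append(cv2.boundingRect(c))  # (x, y, w, h) for the emphasis rectangle
    return boxes
```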
 An example of a vehicle control system to which the technology according to the present disclosure can be applied has been described above. Among the configurations described above, the technology according to the present disclosure can be applied to the vehicle exterior information detection unit 12030 and the vehicle interior information detection unit 12040. Specifically, by using distance measurement by the distance measurement system 1 as the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, it is possible to perform processing for recognizing the driver's gestures, to execute various operations (for example, on an audio system, a navigation system, or an air-conditioning system) in accordance with those gestures, and to detect the driver's state more accurately. It is also possible to use distance measurement by the distance measurement system 1 to recognize unevenness of the road surface and reflect it in suspension control.
 The embodiments of the present technology are not limited to those described above, and various modifications are possible without departing from the gist of the present technology.
 Each of the present technologies described in this specification can be implemented independently on its own, as long as no contradiction arises. Of course, any plurality of the present technologies can also be implemented in combination. For example, part or all of the present technology described in any embodiment can be implemented in combination with part or all of the present technology described in another embodiment. Part or all of any of the present technologies described above can also be implemented in combination with other technologies not described above.
 Further, for example, a configuration described as one device (or processing unit) may be divided and configured as a plurality of devices (or processing units). Conversely, configurations described above as a plurality of devices (or processing units) may be combined into one device (or processing unit). A configuration other than those described above may of course be added to the configuration of each device (or each processing unit). Furthermore, as long as the configuration and operation of the system as a whole are substantially the same, part of the configuration of one device (or processing unit) may be included in the configuration of another device (or processing unit).
 Furthermore, in this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), regardless of whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
 Note that the effects described in this specification are merely examples and are not limiting; effects other than those described in this specification may also be obtained.
 Note that the present technology can also have the following configurations.
(1)
 A distance measurement device comprising:
 a distance measurement sensor that receives reflected light, which is pattern light having two types of luminance, bright parts and dark parts, emitted from a light source device and reflected back by an object; and
 a signal processing unit that detects, as a phase difference, the time from when the pattern light is emitted until it is received as the reflected light, calculates a first distance measurement value, which is the distance to the object, based on the phase difference, detects the positions of the bright parts of the pattern light, calculates a second distance measurement value, which is the distance to the object, by the principle of triangulation using the detected positions of the bright parts, and calculates a fused distance measurement value obtained by fusing the first distance measurement value and the second distance measurement value.
(2)
 The distance measurement device according to (1), wherein the signal processing unit detects the phase difference based on the difference between detection signals of pixels corresponding to the bright parts of the pattern light and detection signals of pixels corresponding to the dark parts of the pattern light.
(3)
 The distance measurement device according to (1) or (2), wherein the signal processing unit detects the positions of the bright parts of the pattern light, acquires the positions of the detected bright parts at the light source device, and calculates the second distance measurement value by the principle of triangulation.
(4)
 The distance measurement device according to any one of (1) to (3), wherein the signal processing unit calculates the fused distance measurement value based on the value of a distance evaluation index that uses the first distance measurement value and the second distance measurement value.
(5)
 The distance measurement device according to any one of (1) to (3), wherein the signal processing unit calculates the fused distance measurement value based on error indexes of a first method for calculating the first distance measurement value and a second method for calculating the second distance measurement value, respectively.
(6)
 The distance measurement device according to any one of (1) to (3), wherein the signal processing unit calculates the fused distance measurement value based on whether or not the detection signals of the pixels used to detect the phase difference indicate saturation.
(7)
 The distance measurement device according to any one of (1) to (6), wherein the pattern light is a dot pattern in which dots are arranged at regular or irregular predetermined intervals.
(8)
 The distance measurement device according to any one of (1) to (7), wherein the signal processing unit further executes resolution-enhancement processing that calculates fused distance measurement values at the positions of the dark parts.
(9)
 The distance measurement device according to (8), wherein the signal processing unit calculates the fused distance measurement value at the position of a dark part located between a plurality of bright parts by using a bilateral upsampling technique.
(10)
 The distance measurement device according to (9), wherein the signal processing unit calculates the fused distance measurement value at the position of the dark part using, as a reference, detection signals of dark parts around each of the plurality of bright parts.
(11)
 The distance measurement device according to (9) or (10), wherein the signal processing unit calculates the fused distance measurement value at the position of the dark part by a weighted average of the detection signals of the plurality of bright parts.
(12)
 The distance measurement device according to any one of (1) to (11), wherein the signal processing unit calculates the first distance measurement value and the second distance measurement value using detection signals of the same frame obtained from the distance measurement sensor.
(13)
 The distance measurement device according to any one of (1) to (11), wherein the signal processing unit calculates the fused distance measurement value at the position of the dark part using, as a reference, a detection signal obtained when the distance measurement sensor receives planar light emitted by the light source device before or after the pattern light.
(14)
 The distance measurement device according to any one of (1) to (11), wherein the signal processing unit calculates the fused distance measurement value at the position of the dark part using, as a reference, a detection signal received by an RGB sensor.
(15)
 A measurement method in which a distance measurement device detects, as a phase difference, the time from when pattern light having two types of luminance, bright parts and dark parts, is emitted from a light source device until it is reflected by an object and received as reflected light, calculates a first distance measurement value, which is the distance to the object, based on the phase difference, detects the positions of the bright parts of the pattern light, calculates a second distance measurement value, which is the distance to the object, by the principle of triangulation using the detected positions of the bright parts, and calculates a fused distance measurement value obtained by fusing the first distance measurement value and the second distance measurement value.
(16)
 A distance measurement system comprising:
 a light source device that emits pattern light having two types of luminance, bright parts and dark parts; and
 a distance measurement device that receives reflected light, which is the pattern light reflected back by an object,
 wherein the distance measurement device includes:
 a distance measurement sensor that receives the reflected light; and
 a signal processing unit that detects, as a phase difference, the time from when the pattern light is emitted until it is received as the reflected light, calculates a first distance measurement value, which is the distance to the object, based on the phase difference, detects the positions of the bright parts of the pattern light, calculates a second distance measurement value, which is the distance to the object, by the principle of triangulation using the detected positions of the bright parts, and calculates a fused distance measurement value obtained by fusing the first distance measurement value and the second distance measurement value.
(17)
 The distance measurement system according to (16), wherein the light source device switches between and emits the pattern light and planar light whose emission luminance over the entire area is uniform within a predetermined luminance range.
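 To make the two measurement paths and their fusion in configuration (1) concrete, here is a minimal sketch assuming an indirect time-of-flight model (distance from the phase difference at modulation frequency f) and a standard triangulation model (distance from disparity with baseline b and focal length in pixels). The inverse-variance weighting stands in for the distance evaluation index / error indexes of configurations (4) and (5) and is an assumption, not the publication's specific method.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_phase(phase_rad: float, mod_freq_hz: float) -> float:
    """First measurement value: indirect ToF distance from the phase difference.

    d = c * phi / (4 * pi * f); the factor 4*pi accounts for the round trip.
    """
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

def distance_from_triangulation(disparity_px: float, baseline_m: float,
                                focal_px: float) -> float:
    """Second measurement value: triangulation from the bright-spot position."""
    return baseline_m * focal_px / disparity_px

def fuse(d1: float, var1: float, d2: float, var2: float) -> float:
    """Inverse-variance weighted fusion of the two distance values (assumed)."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    return (w1 * d1 + w2 * d2) / (w1 + w2)

# Example: a 0.5 rad phase at 20 MHz modulation, and a 40 px bright-spot disparity.
d1 = distance_from_phase(0.5, 20e6)                   # ~0.60 m
d2 = distance_from_triangulation(40.0, 0.05, 500.0)   # ~0.63 m
print(fuse(d1, 0.01, d2, 0.04))  # the lower-variance ToF value dominates here
```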
 1 distance measurement system, 11 light source device, 15 pattern light, 21 distance measurement device, 31 light emitting source, 32 light source driving unit, 41 synchronization control unit, 42 distance measurement sensor, 43 signal processing unit, 63 pixel array unit, 66 signal processing unit, 71 pixel, 81 photoelectric conversion element, 111 light source device, 121 planar light, 141 RGB sensor, 201 smartphone, 202 distance measurement module

Claims (17)

  1.  A distance measurement device comprising:
      a distance measurement sensor that receives reflected light, which is pattern light having two types of luminance, bright parts and dark parts, emitted from a light source device and reflected back by an object; and
      a signal processing unit that detects, as a phase difference, the time from when the pattern light is emitted until it is received as the reflected light, calculates a first distance measurement value, which is the distance to the object, based on the phase difference, detects the positions of the bright parts of the pattern light, calculates a second distance measurement value, which is the distance to the object, by the principle of triangulation using the detected positions of the bright parts, and calculates a fused distance measurement value obtained by fusing the first distance measurement value and the second distance measurement value.
  2.  The distance measurement device according to claim 1, wherein the signal processing unit detects the phase difference based on the difference between detection signals of pixels corresponding to the bright parts of the pattern light and detection signals of pixels corresponding to the dark parts of the pattern light.
  3.  The distance measurement device according to claim 1, wherein the signal processing unit detects the positions of the bright parts of the pattern light, acquires the positions of the detected bright parts at the light source device, and calculates the second distance measurement value by the principle of triangulation.
  4.  The distance measurement device according to claim 1, wherein the signal processing unit calculates the fused distance measurement value based on the value of a distance evaluation index that uses the first distance measurement value and the second distance measurement value.
  5.  The distance measurement device according to claim 1, wherein the signal processing unit calculates the fused distance measurement value based on error indexes of a first method for calculating the first distance measurement value and a second method for calculating the second distance measurement value, respectively.
  6.  The distance measurement device according to claim 1, wherein the signal processing unit calculates the fused distance measurement value based on whether or not the detection signals of the pixels used to detect the phase difference indicate saturation.
  7.  The distance measurement device according to claim 1, wherein the pattern light is a dot pattern in which dots are arranged at regular or irregular predetermined intervals.
  8.  The distance measurement device according to claim 1, wherein the signal processing unit further executes resolution-enhancement processing that calculates fused distance measurement values at the positions of the dark parts.
  9.  The distance measurement device according to claim 8, wherein the signal processing unit calculates the fused distance measurement value at the position of a dark part located between a plurality of bright parts by using a bilateral upsampling technique.
  10.  The distance measurement device according to claim 9, wherein the signal processing unit calculates the fused distance measurement value at the position of the dark part using, as a reference, detection signals of dark parts around each of the plurality of bright parts.
  11.  The distance measurement device according to claim 9, wherein the signal processing unit calculates the fused distance measurement value at the position of the dark part by a weighted average of the detection signals of the plurality of bright parts.
  12.  The distance measurement device according to claim 1, wherein the signal processing unit calculates the first distance measurement value and the second distance measurement value using detection signals of the same frame obtained from the distance measurement sensor.
  13.  The distance measurement device according to claim 9, wherein the signal processing unit calculates the fused distance measurement value at the position of the dark part using, as a reference, a detection signal obtained when the distance measurement sensor receives planar light emitted by the light source device before or after the pattern light.
  14.  The distance measurement device according to claim 9, wherein the signal processing unit calculates the fused distance measurement value at the position of the dark part using, as a reference, a detection signal received by an RGB sensor.
  15.  A measurement method in which a distance measurement device detects, as a phase difference, the time from when pattern light having two types of luminance, bright parts and dark parts, is emitted from a light source device until it is reflected by an object and received as reflected light, calculates a first distance measurement value, which is the distance to the object, based on the phase difference, detects the positions of the bright parts of the pattern light, calculates a second distance measurement value, which is the distance to the object, by the principle of triangulation using the detected positions of the bright parts, and calculates a fused distance measurement value obtained by fusing the first distance measurement value and the second distance measurement value.
  16.  A distance measurement system comprising:
      a light source device that emits pattern light having two types of luminance, bright parts and dark parts; and
      a distance measurement device that receives reflected light, which is the pattern light reflected back by an object,
      wherein the distance measurement device includes:
      a distance measurement sensor that receives the reflected light; and
      a signal processing unit that detects, as a phase difference, the time from when the pattern light is emitted until it is received as the reflected light, calculates a first distance measurement value, which is the distance to the object, based on the phase difference, detects the positions of the bright parts of the pattern light, calculates a second distance measurement value, which is the distance to the object, by the principle of triangulation using the detected positions of the bright parts, and calculates a fused distance measurement value obtained by fusing the first distance measurement value and the second distance measurement value.
  17.  The distance measurement system according to claim 16, wherein the light source device switches between and emits the pattern light and planar light whose emission luminance over the entire area is uniform within a predetermined luminance range.
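 To illustrate the bilateral upsampling referred to in claims 9 to 11: a minimal sketch, assuming a joint (cross) bilateral weighting in which the fused distances at the surrounding bright dots are averaged with weights combining spatial proximity and similarity of a reference intensity signal; all parameter values and names are illustrative assumptions, not the publication's implementation.

```python
import math
from typing import List, Tuple

def upsample_dark_pixel(dark_xy: Tuple[int, int], dark_intensity: float,
                        bright_samples: List[Tuple[int, int, float, float]],
                        sigma_space: float = 8.0,
                        sigma_range: float = 10.0) -> float:
    """Fused distance at a dark-part pixel as a bilateral weighted average.

    bright_samples: (x, y, reference_intensity, fused_distance) for the
    surrounding bright dots; reference_intensity plays the role of the
    reference detection signal (claim 10), and the weighted average of
    the dot values realizes claim 11.
    """
    x0, y0 = dark_xy
    num = den = 0.0
    for x, y, intensity, distance in bright_samples:
        d2 = (x - x0) ** 2 + (y - y0) ** 2
        w_space = math.exp(-d2 / (2.0 * sigma_space ** 2))
        w_range = math.exp(-((intensity - dark_intensity) ** 2)
                           / (2.0 * sigma_range ** 2))
        w = w_space * w_range
        num += w * distance
        den += w
    return num / den if den > 0.0 else float("nan")

# Example: a dark pixel at (10, 10) surrounded by three bright dots; the third
# dot has a very different reference intensity (a depth edge) and barely contributes.
samples = [(4, 10, 120.0, 1.50), (16, 10, 118.0, 1.52), (10, 18, 60.0, 2.40)]
print(upsample_dark_pixel((10, 10), 119.0, samples))  # ~1.51 m
```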
PCT/JP2020/038712 2019-10-28 2020-10-14 Distance measurement device, measurement method, and distance measurement system WO2021085128A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-195197 2019-10-28
JP2019195197 2019-10-28

Publications (1)

Publication Number Publication Date
WO2021085128A1 true WO2021085128A1 (en) 2021-05-06

Family

ID=75716243

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/038712 WO2021085128A1 (en) 2019-10-28 2020-10-14 Distance measurement device, measurement method, and distance measurement system

Country Status (1)

Country Link
WO (1) WO2021085128A1 (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001133253A (en) * 1999-08-24 2001-05-18 Asahi Optical Co Ltd Distance measuring instrument
JP2008008687A (en) * 2006-06-27 2008-01-17 Toyota Motor Corp Distance measuring system and distance measuring method
US20180093377A1 (en) * 2013-03-15 2018-04-05 X Development Llc Determining a Virtual Representation of an Environment By Projecting Texture Patterns
JP2015175644A (en) * 2014-03-13 2015-10-05 株式会社リコー Ranging system, information processing device, information processing method, and program
JP2015210192A (en) * 2014-04-25 2015-11-24 キヤノン株式会社 Metrology device and metrology method
WO2018042801A1 (en) * 2016-09-01 2018-03-08 ソニーセミコンダクタソリューションズ株式会社 Imaging device
JP2019095421A (en) * 2017-11-17 2019-06-20 株式会社リコー Distance measuring apparatus, mobile device and distance measuring method

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022259640A1 (en) * 2021-06-10 2022-12-15 ソニーセミコンダクタソリューションズ株式会社 Distance measurement sensor, distance measurement device, and distance measurement method
CN115712106A (en) * 2021-08-23 2023-02-24 深圳市速腾聚创科技有限公司 Radar data processing method, terminal equipment and computer readable storage medium
EP4141479A1 (en) * 2021-08-23 2023-03-01 Suteng Innovation Technology Co., Ltd Radar data processing method, terminal device, and computer-readable storage medium
JP2023031274A (en) * 2021-08-23 2023-03-08 深セン市速騰聚創科技有限公司 Radar data processing method, terminal device, and computer readable storage medium
JP7352266B2 (en) 2021-08-23 2023-09-28 深セン市速騰聚創科技有限公司 Radar data processing method, terminal device and computer readable storage medium
CN115712106B (en) * 2021-08-23 2023-11-07 深圳市速腾聚创科技有限公司 Radar data processing method, terminal equipment and computer readable storage medium
WO2023159974A1 (en) * 2022-02-24 2023-08-31 华为技术有限公司 Ranging method, photoelectric detection module, chip, electronic device and medium
WO2024004645A1 (en) * 2022-06-29 2024-01-04 ソニーセミコンダクタソリューションズ株式会社 Ranging device, ranging system, and ranging method
WO2024029486A1 (en) * 2022-08-04 2024-02-08 株式会社デンソーウェーブ Distance measurement device

Similar Documents

Publication Publication Date Title
US10746874B2 (en) Ranging module, ranging system, and method of controlling ranging module
WO2021085128A1 (en) Distance measurement device, measurement method, and distance measurement system
WO2018042887A1 (en) Distance measurement device and control method for distance measurement device
JP6972011B2 (en) Distance measuring device and distance measuring method
TW201945757A (en) Distance measurement processing apparatus, distance measurement module, distance measurement processing method, and program
WO2018074085A1 (en) Rangefinder and rangefinder control method
WO2019012756A1 (en) Electronic device and method for controlling electronic device
WO2017195459A1 (en) Imaging device and imaging method
WO2020241294A1 (en) Signal processing device, signal processing method, and ranging module
JP7030607B2 (en) Distance measurement processing device, distance measurement module, distance measurement processing method, and program
WO2021085125A1 (en) Ranging system, drive method, and electronic device
WO2020209079A1 (en) Distance measurement sensor, signal processing method, and distance measurement module
WO2021059682A1 (en) Solid-state imaging element, electronic device, and solid-state imaging element control method
WO2020246264A1 (en) Distance measurement sensor, signal processing method, and distance measurement module
US20220113410A1 (en) Distance measuring device, distance measuring method, and program
WO2021065500A1 (en) Distance measurement sensor, signal processing method, and distance measurement module
WO2021065494A1 (en) Distance measurement sensor, signal processing method, and distance measurement module
WO2021106624A1 (en) Distance measurement sensor, distance measurement system, and electronic apparatus
JP2020118570A (en) Measuring device and distance measuring device
WO2021065542A1 (en) Illumination device, illumination device control method, and distance measurement module
WO2021039458A1 (en) Distance measuring sensor, driving method therefor, and distance measuring module
JP2021182701A (en) Light receiving device, drive control method thereof, and distance measuring device
WO2021106623A1 (en) Distance measurement sensor, distance measurement system, and electronic apparatus
WO2021131684A1 (en) Ranging device, method for controlling ranging device, and electronic apparatus
WO2021065495A1 (en) Ranging sensor, signal processing method, and ranging module

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20883535

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20883535

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP