WO2021161858A1 - Rangefinder and rangefinding method - Google Patents

Rangefinder and rangefinding method

Info

Publication number
WO2021161858A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
light
light receiving
light emitting
control unit
Prior art date
Application number
PCT/JP2021/003780
Other languages
French (fr)
Japanese (ja)
Inventor
貴洋 加戸
俊平 鈴木
Original Assignee
Sony Semiconductor Solutions Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corporation
Priority to JP2022500342A priority Critical patent/JPWO2021161858A1/ja
Publication of WO2021161858A1 publication Critical patent/WO2021161858A1/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00: Measuring distances in line of sight; Optical rangefinders
    • G01C3/02: Details
    • G01C3/06: Use of electric means to obtain final indication
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06: Systems determining position data of a target
    • G01S17/42: Simultaneous measurement of distance and other co-ordinates

Definitions

  • This disclosure relates to a distance measuring device and a distance measuring method.
  • Conventionally, there has been known a distance measuring device, such as LiDAR (Light Detection and Ranging), that measures the distance to an object serving as a reflector by emitting laser light to the outside and receiving the reflected light.
  • This type of distance measuring device is composed of a light emitting unit that emits a plurality of laser beams and a single light receiving element that receives the laser beams (see, for example, Patent Document 1).
  • Therefore, the present disclosure proposes a distance measuring device and a distance measuring method capable of receiving light efficiently.
  • The distance measuring device of one form according to the present disclosure includes a light emitting unit, a light receiving unit, and a control unit.
  • The light emitting unit emits light that scans a predetermined ranging range.
  • In the light receiving unit, a plurality of light receiving elements that receive the reflected light of the light emitted by the light emitting unit are arranged two-dimensionally.
  • The control unit performs control to read a detection signal from the light receiving element at a position corresponding to the scanning position of the light emitting unit, among the plurality of light receiving elements, and to measure the distance.
  • In the present specification and drawings, a plurality of components having substantially the same functional configuration may be distinguished by adding different numbers after the same reference numeral. However, when it is not necessary to distinguish each of the plurality of components having substantially the same functional configuration, only the same reference numeral is given.
  • 1. Embodiment
    1.1 Distance measuring device (ToF sensor)
    1.2 Optical system
    1.3 Light receiving unit
    1.4 SPAD array
    1.5 SPAD pixel
    1.6 Schematic operation example of SPAD pixel
    1.7 SPAD addition unit
    1.8 Sampling cycle
    1.9 Scanning timing of light emitting unit and light receiving unit
    1.10 Abnormality detection of light emitting unit
    1.11 Discrimination between reflected light and ambient light
    2.
  • FIG. 1 is a block diagram showing a schematic configuration example of a ToF sensor as a distance measuring device according to the present embodiment.
  • the ToF sensor 1 includes a control unit 11, a light emitting unit 13, a light receiving unit 14, a calculation unit 15, and an external interface (I / F) 19.
  • The control unit 11 is composed of, for example, an information processing device such as a CPU (Central Processing Unit), and controls each unit of the ToF sensor 1.
  • The external I/F 19 may be, for example, a communication adapter for establishing communication with an external host 80 via a communication network conforming to any standard such as a wireless LAN (Local Area Network), a wired LAN, CAN (Controller Area Network), LIN (Local Interconnect Network), or FlexRay (registered trademark).
  • When the ToF sensor 1 is mounted on an automobile or the like, the host 80 may be, for example, an ECU (Engine Control Unit) mounted on the automobile or the like.
  • Further, when the ToF sensor 1 is mounted on an autonomous mobile body such as a domestic pet robot, a robot vacuum cleaner, an unmanned aerial vehicle, or a follow-up transport robot, the host 80 may be, for example, a control device or the like that controls the autonomous mobile body.
  • The light emitting unit 13 includes, for example, one or a plurality of semiconductor laser diodes as a light source, and emits pulsed laser light L1 having a predetermined time width at a predetermined period (also referred to as a light emission cycle). For example, the light emitting unit 13 emits the laser beam L1 having a time width of 1 ns (nanosecond) at a frequency of 1 MHz (megahertz). When an object 90 exists within the ranging range, the laser beam L1 emitted from the light emitting unit 13 is reflected by the object 90 and is incident on the light receiving unit 14 as reflected light L2.
  • The details of the light receiving unit 14 will be described later; the light receiving unit 14 includes, for example, a plurality of SPAD pixels arranged in a two-dimensional grid, and outputs information on the number of SPAD pixels that detect the incidence of photons after the light emitting unit 13 emits light (hereinafter, the number of detections; for example, corresponding to the number of detection signals described later). For example, for one emission of the light emitting unit 13, the light receiving unit 14 detects the incidence of photons at a predetermined sampling cycle and outputs the number of detections.
  • The calculation unit 15 aggregates the number of detections output from the light receiving unit 14 for each unit of a plurality of SPAD pixels (for example, corresponding to one or a plurality of macro pixels described later), and, based on the pixel values obtained by the aggregation, creates a histogram whose horizontal axis is the flight time and whose vertical axis is the cumulative pixel value. Specifically, the calculation unit 15 repeats, over a plurality of emissions of the light emitting unit 13, the calculation of the pixel value obtained by totaling the number of detections at a predetermined sampling cycle for each single emission. As a result, a histogram is created in which the horizontal axis (histogram bins) is the sampling cycle corresponding to the flight time, and the vertical axis is the cumulative pixel value obtained by accumulating the pixel values obtained in each sampling cycle.
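The following is a minimal sketch of this accumulation in Python (hypothetical names; the patent describes hardware blocks, not code): per-sampling-period detection counts for one macro pixel are summed over repeated emissions to form the histogram.

```python
import numpy as np

def accumulate_histogram(emissions: list[np.ndarray]) -> np.ndarray:
    """Build the ToF histogram for one macro pixel.

    Each element of `emissions` holds, for one laser emission, the number of
    SPAD detections counted in each sampling period (the pixel values), so
    the bin index corresponds to flight time.
    """
    hist = np.zeros_like(emissions[0], dtype=np.int64)
    for pixel_values in emissions:   # repeat over a plurality of emissions
        hist += pixel_values         # accumulate the cumulative pixel value
    return hist
```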
  • The calculation unit 15 applies a predetermined filter process to the created histogram, and then specifies, from the filtered histogram, the flight time at which the cumulative pixel value peaks. Then, the calculation unit 15 calculates the distance from the ToF sensor 1, or from the device on which the ToF sensor 1 is mounted, to the object 90 existing in the ranging range based on the specified flight time.
  • the distance information calculated by the calculation unit 15 may be output to the host 80 or the like via the external I / F 19, for example.
  • FIG. 2 is a diagram for explaining an optical system of the ToF sensor according to the present embodiment.
  • Note that FIG. 2 illustrates a so-called scan-type optical system that scans the angle of view of the light receiving unit 14 in the horizontal direction, but the present disclosure is not limited to this; for example, a so-called flash-type ToF sensor in which the angle of view of the light receiving unit 14 is fixed may also be used.
  • the ToF sensor 1 includes a light source 131, a collimator lens 132, a half mirror 133, a galvano mirror 135, a light receiving lens 146, and a SPAD array 141 as an optical system.
  • the light source 131, the collimator lens 132, the half mirror 133, and the galvano mirror 135 are included in the light emitting unit 13 in FIG. 1, for example.
  • the light receiving lens 146 and the SPAD array 141 are included in the light receiving unit 14 in FIG. 1, for example.
  • The laser beam L1 emitted from the light source 131 is converted by the collimator lens 132 into rectangular parallel light having a cross-sectional intensity distribution that is long in the vertical direction, and is then incident on the half mirror 133.
  • the half mirror 133 reflects a part of the incident laser beam L1.
  • the laser beam L1 reflected by the half mirror 133 is incident on the galvano mirror 135.
  • The galvano mirror 135 is vibrated in the horizontal direction about a predetermined rotation axis by, for example, a drive unit 134 that operates based on control from the control unit 11.
  • As a result, the laser beam L1 is horizontally scanned so that the angle of view SR of the laser beam L1 reflected by the galvano mirror 135 reciprocates in the horizontal direction within the ranging range AR.
  • For the drive unit 134, for example, a MEMS (Micro Electro Mechanical Systems) actuator, a micro motor, or the like can be used.
  • the laser beam L1 reflected by the galvano mirror 135 is reflected by the object 90 existing in the ranging range AR, and is incident on the galvano mirror 135 as reflected light L2.
  • A part of the reflected light L2 incident on the galvano mirror 135 passes through the half mirror 133 and is incident on the light receiving lens 146, whereby an image is formed on a specific SPAD array 142 within the SPAD array 141.
  • the SPAD array 142 may be the whole or a part of the SPAD array 141.
  • FIG. 3 is a block diagram showing a schematic configuration example of the light receiving unit according to the present embodiment.
  • the light receiving unit 14 includes a SPAD array 141, a timing control circuit 143, a drive circuit 144, and an output circuit 145.
  • the SPAD array 141 includes a plurality of SPAD pixels 20 arranged in a two-dimensional grid pattern.
  • In the SPAD array 141, a pixel drive line LD (vertical direction in the drawing) is connected for each column, and an output signal line LS (horizontal direction in the drawing) is connected for each row. One end of the pixel drive line LD is connected to the output end of the drive circuit 144 corresponding to each column, and one end of the output signal line LS is connected to the input end of the output circuit 145 corresponding to each row.
  • the reflected light L2 is detected by using all or a part of the SPAD array 141.
  • The region used in the SPAD array 141 (the SPAD array 142) may have a vertically long rectangular shape, the same as the image of the reflected light L2 formed on the SPAD array 141 when the entire laser beam L1 is reflected as the reflected light L2. However, the region is not limited to this, and various modifications may be made, such as a region larger or smaller than the image of the reflected light L2 formed on the SPAD array 141.
  • The drive circuit 144 includes a shift register, an address decoder, and the like, and drives the SPAD pixels 20 of the SPAD array 141 simultaneously for all pixels, in column units, or the like. For this purpose, the drive circuit 144 includes at least a circuit that applies a quench voltage V_QCH described later to each SPAD pixel 20 in the selected column of the SPAD array 141, and a circuit that applies a selection control voltage V_SEL described later to each SPAD pixel 20 in the selected column. Then, by applying the selection control voltage V_SEL to the pixel drive line LD corresponding to the column to be read, the drive circuit 144 selects, in column units, the SPAD pixels 20 used for detecting the incidence of photons.
  • The signal (referred to as a detection signal) V_OUT output from each SPAD pixel 20 in the column selected by the drive circuit 144 is input to the output circuit 145 through each of the output signal lines LS.
  • the output circuit 145 outputs the detection signal V_OUT input from each SPAD pixel 20 to the SPAD addition unit 40 provided for each macro pixel described later.
  • the timing control circuit 143 includes a timing generator and the like that generate various timing signals, and controls the drive circuit 144 and the output circuit 145 based on the various timing signals generated by the timing generator.
  • FIG. 4 is a schematic diagram showing a schematic configuration example of the SPAD array according to the present embodiment.
  • the SPAD array 142 includes, for example, a configuration in which a plurality of SPAD pixels 20 are arranged in a two-dimensional grid pattern.
  • the plurality of SPAD pixels 20 are grouped into a plurality of macro pixels 30 composed of a predetermined number of SPAD pixels 20 arranged in the row and / or column direction.
  • the shape of the region connecting the outer edges of the SPAD pixels 20 located on the outermost periphery of each macro pixel 30 has a predetermined shape (for example, a rectangle).
  • the SPAD array 142 is composed of, for example, a plurality of macro pixels 30 arranged in the vertical direction (corresponding to the column direction).
  • the SPAD array 142 is divided into, for example, a plurality of regions (hereinafter, referred to as SPAD regions) in the vertical direction.
  • the SPAD array 142 is divided into four SPAD regions 142-1 to 142-4.
  • The lowest SPAD region 142-1 corresponds to, for example, the lowest 1/4 region of the angle of view SR of the SPAD array 142, the SPAD region 142-2 above it corresponds to the second 1/4 region from the bottom, the SPAD region 142-3 above that corresponds to the third 1/4 region from the bottom, and the highest SPAD region 142-4 corresponds to the highest 1/4 region of the angle of view SR.
  • FIG. 5 is a circuit diagram showing a schematic configuration example of the SPAD pixel according to the present embodiment.
  • the SPAD pixel 20 includes a photodiode 21 as a light receiving element and a readout circuit 22 for detecting that a photon is incident on the photodiode 21.
  • The photodiode 21 generates an avalanche current when a photon is incident in a state where a reverse bias voltage V_SPAD equal to or higher than the breakdown voltage is applied between its anode and cathode.
  • the readout circuit 22 includes a quench resistor 23, a digital converter 25, an inverter 26, a buffer 27, and a selection transistor 24.
  • The quench resistor 23 is composed of, for example, an N-type MOSFET (Metal Oxide Semiconductor Field Effect Transistor, hereinafter referred to as an NMOS transistor); its drain is connected to the anode of the photodiode 21, and its source is grounded via the selection transistor 24. Further, a preset quench voltage V_QCH for causing the NMOS transistor to act as a quench resistor is applied from the drive circuit 144, via the pixel drive line LD, to the gate of the NMOS transistor constituting the quench resistor 23.
  • The photodiode 21 is a SPAD.
  • The SPAD is an avalanche photodiode that operates in Geiger mode when a reverse bias voltage equal to or higher than the breakdown voltage is applied between its anode and cathode, and can detect the incidence of a single photon.
  • the digital converter 25 includes a resistor 251 and an NMOS transistor 252.
  • The drain of the NMOS transistor 252 is connected to the power supply voltage VDD via the resistor 251, and its source is grounded. The voltage at the connection point N1 between the anode of the photodiode 21 and the quench resistor 23 is applied to the gate of the NMOS transistor 252.
  • The inverter 26 includes a P-type MOSFET (hereinafter referred to as a PMOS transistor) 261 and an NMOS transistor 262.
  • The source of the PMOS transistor 261 is connected to the power supply voltage VDD, and its drain is connected to the drain of the NMOS transistor 262.
  • The drain of the NMOS transistor 262 is connected to the drain of the PMOS transistor 261, and its source is grounded.
  • The voltage at the connection point N2 between the resistor 251 and the drain of the NMOS transistor 252 is applied to the gate of the PMOS transistor 261 and the gate of the NMOS transistor 262.
  • the output of the inverter 26 is input to the buffer 27.
  • The buffer 27 is a circuit for impedance conversion; when an output signal is input from the inverter 26, the buffer 27 impedance-converts the input signal and outputs it as the detection signal V_OUT.
  • The selection transistor 24 is, for example, an NMOS transistor; its drain is connected to the source of the NMOS transistor constituting the quench resistor 23, and its source is grounded.
  • The selection transistor 24 is connected to the drive circuit 144, and changes from the off state to the on state when the selection control voltage V_SEL from the drive circuit 144 is applied to its gate via the pixel drive line LD.
  • The readout circuit 22 illustrated in FIG. 5 operates, for example, as follows. First, during the period in which the selection control voltage V_SEL is applied from the drive circuit 144 to the selection transistor 24 and the selection transistor 24 is in the on state, a reverse bias voltage V_SPAD equal to or higher than the breakdown voltage is applied to the photodiode 21. As a result, the operation of the photodiode 21 is permitted.
  • When a photon is incident on the photodiode 21 in this state, an avalanche current is generated and the NMOS transistor 252 turns on; then the PMOS transistor 261 changes from the off state to the on state, the NMOS transistor 262 changes from the on state to the off state, and the voltage at the connection point N3 changes from 0 V to the power supply voltage VDD. As a result, the high-level detection signal V_OUT is output from the buffer 27. That is, the high-level detection signal V_OUT is output from the timing at which the photon is incident on the photodiode 21 and the NMOS transistor 252 turns on until the timing at which the avalanche current stops and the NMOS transistor 252 turns off.
  • The output detection signal V_OUT is input, for each macro pixel 30, to the SPAD addition unit 40 via the output circuit 145. Therefore, detection signals V_OUT equal in number to the SPAD pixels 20 in which the incidence of photons was detected (the number of detections), among the plurality of SPAD pixels 20 constituting one macro pixel 30, are input to each SPAD addition unit 40.
  • FIG. 6 is a block diagram showing a more detailed configuration example of the SPAD addition unit according to the present embodiment.
  • The SPAD addition unit 40 may be included in the light receiving unit 14 or in the calculation unit 15.
  • the SPAD addition unit 40 includes, for example, a pulse shaping unit 41 and a light receiving number counting unit 42.
  • The pulse shaping unit 41 shapes the pulse waveform of the detection signal V_OUT input from the SPAD array 141 via the output circuit 145 into a pulse waveform having a time width corresponding to the operating clock of the SPAD addition unit 40.
  • The light receiving number counting unit 42 counts the detection signals V_OUT input from the corresponding macro pixel 30 for each sampling cycle, thereby counting, for each sampling cycle, the number of SPAD pixels 20 in which the incidence of photons was detected (the number of detections), and outputs this count value as the pixel value of the macro pixel 30.
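As a minimal sketch of this counting step (the array representation is an assumption, not the patent's circuit), the pixel value of a macro pixel is simply the per-sampling-period count of SPAD pixels whose detection signal is active:

```python
import numpy as np

def macro_pixel_values(v_out: np.ndarray) -> np.ndarray:
    """v_out: boolean array of shape (n_sampling_periods, n_spads_per_macro),
    True where a SPAD pixel output a detection signal V_OUT in that period.
    Returns the number of detections (pixel value) per sampling period."""
    return v_out.sum(axis=1)
```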
  • the sampling cycle is a cycle for measuring the time (flight time) from the emission of the laser beam L1 by the light emitting unit 13 to the detection of the incident of photons by the light receiving unit 14.
  • A period shorter than the light emission cycle of the light emitting unit 13 is set as this sampling cycle. By shortening the sampling period, it is possible to calculate the flight time of a photon emitted from the light emitting unit 13 and reflected by the object 90 with a higher time resolution. This means that, by increasing the sampling frequency, the distance to the object 90 can be calculated with a higher ranging resolution.
  • When the flight time is Δt and the speed of light is C, the distance L to the object 90 can be calculated by the following equation (1).
  • L = C × Δt / 2 … (1)
  • For example, when the sampling frequency is 1 GHz, the sampling period is 1 ns (nanosecond). In that case, one sampling period corresponds to 15 cm (centimeters); that is, the ranging resolution is 15 cm when the sampling frequency is 1 GHz. Further, when the sampling frequency is doubled to 2 GHz, the sampling period is 0.5 ns, so one sampling period corresponds to 7.5 cm; that is, the ranging resolution can be halved when the sampling frequency is doubled. By increasing the sampling frequency and shortening the sampling period in this way, the distance to the object 90 can be calculated more accurately.
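The numbers above follow directly from equation (1); the short sketch below (hypothetical helper names) reproduces them:

```python
C = 299_792_458.0  # speed of light [m/s]; the text rounds to 3 x 10^8

def distance_m(flight_time_s: float) -> float:
    """Equation (1): L = C * dt / 2, halved for the round trip."""
    return C * flight_time_s / 2.0

def resolution_m(sampling_freq_hz: float) -> float:
    """Distance step corresponding to one sampling period."""
    return distance_m(1.0 / sampling_freq_hz)

print(resolution_m(1e9))  # ~0.15 m at 1 GHz
print(resolution_m(2e9))  # ~0.075 m at 2 GHz
```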
  • FIG. 7 is a diagram showing scanning timings of the light emitting unit 13 and the light receiving unit 14, respectively. Note that FIG. 7 shows an example in which the ranging range AR (see FIG. 2) is scanned three times.
  • The control unit 11 performs control to read a detection signal from the light receiving element at a position corresponding to the scanning position of the light emitting unit 13, among the plurality of light receiving elements, and to measure the distance. Specifically, the control unit 11 reads out the detection signals of the SPAD pixels 20 included in the SPAD array 142 corresponding to the scanning position of the light emitting unit 13 within the SPAD array 141.
  • Specifically, the control unit 11 starts scanning of the laser beam L1 by the light emitting unit 13, and reads out the SPAD array 142 at the position in the SPAD array 141 corresponding to the scanning position of the light emitting unit 13.
  • In FIG. 7, the scanning of the light of the light emitting unit 13 is linear, while the scanning of the reading of the light receiving unit 14 is stepwise. That is, the control unit 11 gradually changes the scanning position of the light emitting unit 13 while changing the scanning position (reading position) of the light receiving unit 14 in predetermined angle units. In the example shown in FIG. 7, the control unit 11 reads out the first SPAD array 142 from time t1 to time t2.
  • Next, from time t2, the control unit 11 reads out the SPAD array 142 at the position corresponding to the scanning position of the light emitting unit 13, that is, the second SPAD array 142 located to the right of the leftmost SPAD array 142. After that, the control unit 11 similarly reads out the SPAD array 142 one position further to the right at each of times t3, t4, t5, and t6, reading out the rightmost SPAD array 142 at time t6.
  • In this way, the control unit 11 reads out the SPAD array 142 at the position corresponding to the scanning position of the light emitting unit 13 in the light receiving unit 14 in which the SPAD pixels 20 are arranged two-dimensionally, so that the light of the light emitting unit 13 can be efficiently received (read) by the light receiving unit 14.
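A minimal sketch of this position-matched readout (function and parameter names are assumptions): the current emitter angle is mapped to the index of the SPAD array 142 to be read.

```python
def column_for_angle(scan_angle_deg: float,
                     fov_start_deg: float,
                     step_deg: float,
                     n_positions: int) -> int:
    """Return the index of the SPAD array 142 to read for a given
    emitter scan angle; the reader steps once per `step_deg` of sweep."""
    idx = int((scan_angle_deg - fov_start_deg) / step_deg)
    return max(0, min(n_positions - 1, idx))  # clamp to the array bounds
```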
  • While FIG. 7 shows a case where the light emitting unit 13 scans linearly, the light emitting unit 13 may scan the light in a stepwise manner, for example, as shown in FIG. 8.
  • FIG. 8 is a diagram showing scanning timings of the light emitting unit 13 and the light receiving unit 14 according to the modified example.
  • In this modified example, the control unit 11 makes the scanning of light by the light emitting unit 13 stepwise. That is, the control unit 11 changes the scanning positions of the light emitting unit 13 and the light receiving unit 14 in predetermined angle units. Specifically, as shown in FIG. 8, during the period from time t1 to time t2, the control unit 11 causes the light emitting unit 13 to emit light with its scanning position fixed at a predetermined angle.
  • control unit 11 reads out the SPAD array 142 at a position corresponding to the scanning position of the light emitting unit 13 from time t1 to time t2.
  • Then, the control unit 11 changes the scanning position of the light emitting unit 13 to the next predetermined angle, and reads out the next SPAD array 142 (one position to the right) in the light receiving unit 14.
  • In this way, the control unit 11 makes the scanning position of the light emitting unit 13 stepwise in accordance with the scanning of the light receiving unit 14. As a result, the angle-of-view difference between the light emitting unit 13 and the light receiving unit 14 is eliminated, so that the light of the light emitting unit 13 can be efficiently received by the light receiving unit 14.
  • The control unit 11 is not limited to scanning the light emitting unit 13 in a stepwise manner; for example, as shown in FIG. 9, the scanning position of the light receiving unit 14 may be shifted from the scanning position of the light emitting unit 13.
  • FIG. 9 is a diagram showing scanning timings of the light emitting unit 13 and the light receiving unit 14 according to the modified example.
  • In this modified example, the control unit 11 reads out the SPAD array 142 at a position shifted by a predetermined deviation angle θ with respect to the angle of the light emitting unit 13, which scans linearly. That is, the control unit 11 makes the scanning start position of the light emitting unit 13 and the scanning start position of the light receiving unit 14 differ by the deviation angle θ. Specifically, as shown in FIG. 9, the control unit 11 starts scanning of the light emitting unit 13 from the angle A1 at time t1.
  • the control unit 11 reads out the SPAD array 142 at the position corresponding to the angle B1 at the time t1.
  • The angle B1 is smaller than the angle A1 by the deviation angle θ.
  • The deviation angle θ is an angle corresponding to approximately half of the angle interval of the light receiving unit 14, that is, approximately half of the amount of change from the angle B1 to the angle B2.
  • Then, the control unit 11 changes the scanning position of the light receiving unit 14 from the angle B1 to the angle B2 and continues scanning. That is, the control unit 11 changes the scanning position of the light receiving unit 14 in predetermined angle units.
  • As a result, the deviation between the scanning position of the light emitting unit 13 and the scanning position of the light receiving unit 14 is at most the deviation angle θ, so that a decrease in the light receiving efficiency of the light receiving unit 14 can be suppressed.
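A minimal sketch of this offset readout, under the assumption that the read grid is offset from the emitter's start angle by θ = half a read step and that the nearest grid point is always read:

```python
def read_angle(emit_angle_deg: float, step_deg: float,
               start_deg: float) -> float:
    """Return the read position for a linearly scanning emitter.

    The read positions form a grid offset from the emitter start angle
    by theta = step_deg / 2 (the deviation angle); reading the nearest
    grid point keeps the emitter within theta of the read position.
    """
    theta = step_deg / 2.0
    k = round((emit_angle_deg - (start_deg - theta)) / step_deg)
    return (start_deg - theta) + k * step_deg
```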
  • The scanning timings of the light emitting unit 13 and the light receiving unit 14 shown in FIGS. 7 to 9 are based on the premise that the scanning positions of the light emitting unit 13 and the light receiving unit 14 are substantially parallel. If they are not substantially parallel, it is necessary to change the scanning timing. This point will be described with reference to FIGS. 10A and 10B.
  • FIGS. 10A and 10B are diagrams showing scanning timings of the light emitting unit 13 and the light receiving unit 14 according to a modified example.
  • FIG. 10A shows an example in which the scanning position of the laser beam L1 (reflected light L2) of the light emitting unit 13 is not substantially parallel to the scanning position (SPAD arrays 142a, 142b) of the light receiving unit 14.
  • the control unit 11 divides the SPAD array 142a and the SPAD array 142b into a plurality of regions and reads them out in units of the divided regions.
  • FIG. 10A shows an example in which each of the SPAD arrays 142a and 142b is divided into four SPAD regions 142a-1 to 142a-4 and 142b-1 to 142b-4 (see FIG. 4), but the number of divisions may be three or less or five or more.
  • In the following, the lower two SPAD regions 142a-1 to 142a-2 and 142b-1 to 142b-2 are referred to as the first regions 142a-10 and 142b-10, respectively, and the upper two SPAD regions 142a-3 to 142a-4 and 142b-3 to 142b-4 are referred to as the second regions 142a-20 and 142b-20, respectively.
  • The control unit 11 shifts the scanning timing between the first regions 142a-10 and 142b-10 and the second regions 142a-20 and 142b-20. Specifically, the control unit 11 scans the first region 142a-10 of the SPAD array 142a and the second region 142b-20 of the SPAD array 142b at the same scanning timing.
  • control unit 11 makes the scanning timings different between the first region 142a-10 and the second region 142a-20 in the SPAD array 142a.
  • For example, the control unit 11 may scan the second region 142a-20 of the SPAD array 142a at a scanning timing earlier than the scanning timing of the first region 142a-10 of the SPAD array 142a and the second region 142b-20 of the SPAD array 142b.
  • Alternatively, the control unit 11 may scan the second region 142a-20 of the SPAD array 142a at a scanning timing later than the scanning timing of the first region 142a-10 of the SPAD array 142a and the second region 142b-20 of the SPAD array 142b.
  • FIG. 10B shows the scanning timings of the first region 142-10 and the second region 142-20 in one SPAD array 142.
  • control unit 11 starts scanning the laser beam L1 by the light emitting unit 13 at time t1 and scans the first region 142-10 in the SPAD array 142.
  • control unit 11 scans the second region 142-20 in the SPAD array 142 at time t2.
  • the time t2 is substantially intermediate between the time t1 and the time t3.
  • the control unit 11 moves the scanning position of the first region 142-10 in the SPAD array 142 to the right.
  • FIGS. 11 and 12 are diagrams for explaining the abnormality detection method for the light emitting unit 13. For example, if scanning stops due to a failure of the drive unit 134 of the light emitting unit 13 or of the galvano mirror 135, light is emitted only at a specific position in the ranging range AR. In this case, distance measurement cannot be performed, and the laser beam may no longer satisfy the Class 1 safety standard.
  • Conventionally, an abnormality in the light emitting unit was detected by branching the optical path of the light emitting unit and making a part of the light incident on a dedicated photodetector or the like to monitor whether the light was projected correctly.
  • In contrast, the control unit 11 detects an abnormality in which the scanning of the light emitting unit 13 has stopped. Specifically, the control unit 11 detects an abnormality in the light emitting unit 13 when the cumulative pixel value based on the detection signals for each predetermined number of SPAD pixels 20 output from the light receiving unit 14 is equal to or greater than a predetermined threshold value.
  • FIGS. 11 and 12 show histograms generated by the above-described calculation unit 15. Specifically, FIGS. 11 and 12 show line graphs of histograms in which the vertical axis is the cumulative pixel value and the horizontal axis is the time (flight time).
  • FIG. 11 is a graph for the case where the light emitting unit 13 is normal, and FIG. 12 is a graph for the case where the pulse width of the light emitting unit 13 has widened.
  • As shown in FIG. 11, when the light emitting unit 13 is normal, a peak P1 corresponding to the object 90 (see FIG. 1), which is a reflector, appears.
  • the peak P1 has a peak width close to the pulse width of the laser beam L1.
  • On the other hand, when the light emitting unit 13 is abnormal, the pulse width of the laser beam L1 becomes wider, and the peak width W of the peak P2 inevitably becomes wider accordingly.
  • Therefore, the control unit 11 detects an abnormality in the light emitting unit 13 when the peak width W of the detected peak P2 is equal to or greater than a predetermined threshold value. As a result, an abnormality in the light emitting unit 13 can be detected with high accuracy.
  • The threshold value for the peak width W is set according to the pulse width of the laser beam L1. That is, when the light emitting unit 13 is normal, the peak width of the peak P1, which is the reflected light L2, and the pulse width of the laser beam L1 are substantially the same; therefore, when the peak width W of the peak P2 is wider than the pulse width of the laser beam L1 by a predetermined value or more, it is determined that the light emitting unit 13 is abnormal.
  • Since the control unit 11 detects an abnormality in the light emitting unit 13 from the detection signal of the light receiving unit 14 used for distance measurement, a dedicated photodetector for detecting the laser beam, as in the conventional case, is not required. Therefore, the control unit 11 according to the embodiment can detect an abnormality in the light emitting unit 13 with high accuracy at low cost.
  • The position at which the peak width W is measured may be any position; for example, abnormality detection may be performed using the peak width W of the cumulative pixel value at the threshold value TH for peak detection described later (FIG. 13).
  • Alternatively, the abnormality may be detected using the peak width W at the midpoint (half value) between the peak value and the threshold value TH.
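A minimal sketch of this check (the NumPy representation and the margin parameter are assumptions; a single peak above TH is assumed): the peak width measured at the threshold TH is compared against the emitted pulse width.

```python
import numpy as np

def emitter_abnormal(hist: np.ndarray, bin_ns: float,
                     pulse_ns: float, margin_ns: float,
                     th: float) -> bool:
    """Return True if the histogram peak is wider than the laser pulse
    by the given margin, i.e. the light emitting unit is abnormal."""
    if hist.max() < th:
        return False                     # no peak above the threshold TH
    above = np.flatnonzero(hist >= th)   # bins above TH (one peak assumed)
    width_ns = (above[-1] - above[0] + 1) * bin_ns
    return width_ns >= pulse_ns + margin_ns
```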
  • As another method of detecting an abnormality in which the scanning of the light emitting unit 13 has stopped, there is a method of detection based on the amount of light received by the light receiving elements for each row or column of the light receiving unit 14 in which the light receiving elements (SPAD pixels 20) are arranged two-dimensionally.
  • Specifically, the control unit 11 detects an abnormality in which the scanning of the light emitting unit 13 has stopped when the amount of received light (cumulative pixel value) of the light receiving elements in a specific row or column, among the light receiving elements arranged two-dimensionally in the row and column directions, is equal to or greater than a predetermined threshold value.
  • That is, when the scanning direction of the light emitting unit 13 is the horizontal direction (FIG. 2), the control unit 11 detects an abnormality based on the amount of light received by the light receiving elements for each column, and when the scanning direction of the light emitting unit 13 is the vertical direction (FIG. 15), the control unit 11 detects an abnormality based on the amount of light received by the light receiving elements for each row. As a result, an abnormality in which the scanning of the light emitting unit 13 has stopped can be detected with high accuracy.
  • FIGS. 13 and 14 are diagrams showing a process of discriminating between the reflected light L2 and the ambient light.
  • The ambient light referred to here is light caused by the surrounding environment, such as sunlight.
  • For example, when the ambient light is strong, the reflected light L2 may be buried in the ambient light and the distance may not be measured correctly. That is, accurate distance measurement cannot be performed if the ambient light is strong.
  • Therefore, the control unit 11 discriminates between the peak corresponding to the reflected light L2 and the peak corresponding to the ambient light based on the histogram generated by the calculation unit 15. Specifically, the control unit 11 detects, from a plurality of peaks included in the histogram, a peak whose cumulative pixel value and peak width satisfy predetermined conditions.
  • Specifically, the control unit 11 first extracts, from the plurality of peaks included in the histogram, peaks whose cumulative pixel value is equal to or greater than a predetermined threshold value TH. Then, among the extracted peaks, the control unit 11 detects a peak whose peak width is equal to or greater than a predetermined threshold value as the reflected light L2.
  • For example, the control unit 11 performs the detection process using the peak width Wth of the cumulative pixel value at the threshold value TH.
  • Alternatively, the control unit 11 may perform the detection process using the peak width Wh at the midpoint (half value) between the peak value and the threshold value TH.
  • Alternatively, the detection process may be performed using both the peak width Wth and the peak width Wh.
  • On the other hand, the control unit 11 detects, as ambient light, peaks whose cumulative pixel value is less than the predetermined threshold value TH and peaks whose peak widths Wth and Wh are less than the predetermined threshold values. In this way, the control unit 11 can discriminate between the reflected light L2 and the ambient light by using the cumulative pixel value and the peak width of the histogram. That is, since the control unit 11 according to the embodiment can discriminate between the ambient light and the reflected light L2 even in an environment where the ambient light is strong, the distance can be measured with high accuracy.
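A minimal sketch of this discrimination (SciPy-based, which is my assumption; the patent does not specify an implementation, and the peak's base width is used here as a stand-in for Wth):

```python
import numpy as np
from scipy.signal import find_peaks, peak_widths

def reflected_peaks(hist: np.ndarray, th: float,
                    min_width_bins: float) -> np.ndarray:
    """Return bin indices of histogram peaks treated as reflected light L2."""
    idx, _ = find_peaks(hist, height=th)   # cumulative pixel value >= TH
    if idx.size == 0:
        return idx
    w_h = peak_widths(hist, idx, rel_height=0.5)[0]     # half-value width Wh
    w_base = peak_widths(hist, idx, rel_height=1.0)[0]  # base width (for Wth)
    keep = (w_h >= min_width_bins) & (w_base >= min_width_bins)
    return idx[keep]   # peaks failing height or width count as ambient light
```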
  • The control unit 11 may further extract peaks satisfying a predetermined condition when there are a predetermined number or more of peaks whose cumulative pixel value is equal to or greater than the predetermined threshold value TH.
  • For example, the control unit 11 may extract a predetermined number of peaks in order of shorter time (closer distance) from among the peaks whose cumulative pixel value is equal to or greater than the predetermined threshold value TH.
  • Alternatively, the control unit 11 may extract a predetermined number of peaks in descending order of cumulative pixel value from among the peaks whose cumulative pixel value is equal to or greater than the predetermined threshold value TH. As a result, peaks with high reliability as the reflected light L2 can be extracted, so that the object 90 can be detected (the distance can be measured) with higher accuracy.
  • The control unit 11 may also discriminate between the reflected light L2 and the ambient light based on, for example, the shape of the laser beam L1 reflected inside the ToF sensor 1. This point will be described with reference to FIG. 14.
  • Specifically, the control unit 11 first extracts the peak P1, detected at a short distance of less than a predetermined time, as the laser beam L1 due to internal reflection. For example, whether or not the peak P1 is the laser beam L1 due to internal reflection may be determined based on the peak shape at the time of internal reflection of the laser beam L1, obtained in advance by experiment or the like.
  • Then, the control unit 11 detects, as the reflected light L2, a peak P2 having a peak shape similar to that of the peak P1, which is the internal reflection of the laser beam L1. Specifically, the control unit 11 detects a peak P2 whose feature amounts are similar to those of the peak shape of the peak P1.
  • The feature amounts are, for example, information on the peak value (cumulative pixel value) and the peak width.
  • Alternatively, the control unit 11 may convert the histogram into an image and search for the peak P2 by pattern matching.
  • As a result, the reflected light L2 can be detected with high accuracy.
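A minimal sketch of the feature comparison (the feature set and tolerance are assumptions; the patent only names the peak value and peak width as features):

```python
def similar_shape(p1: dict, p2: dict, tol: float = 0.2) -> bool:
    """Accept a candidate peak P2 as reflected light L2 when its features
    are within a relative tolerance of the internal-reflection peak P1.
    Expected keys: 'height' (peak cumulative pixel value), 'width' (bins)."""
    return all(
        abs(p1[key] - p2[key]) <= tol * p1[key]
        for key in ("height", "width")
    )
```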
  • FIG. 15 is a diagram showing the scanning direction of the light emitting unit 13 according to the modified example.
  • FIG. 16 is a diagram showing the scanning direction of the light receiving unit 14 according to the modified example.
  • In this modified example, the laser beam L1 emitted from the galvano mirror 135 of the light emitting unit 13 is rectangular parallel light having a cross-sectional intensity distribution that is long in the horizontal direction. The galvano mirror 135 is vibrated in the vertical direction about a predetermined rotation axis by, for example, the drive unit 134 (see FIG. 2) that operates based on control from the control unit 11. As a result, the laser beam L1 is vertically scanned so that the angle of view SR of the laser beam L1 reflected by the galvano mirror 135 reciprocates in the vertical direction within the ranging range AR.
  • the laser beam L1 reflected by the galvano mirror 135 is reflected by the object 90 existing in the ranging range AR, and is incident on the galvano mirror 135 as reflected light L2.
  • the reflected light L2 incident on the galvano mirror 135 is incident on the light receiving unit 14.
  • As shown in FIG. 16, the light receiving unit 14 reads out the SPAD array 142 in the row corresponding to the scanning position of the light emitting unit 13 among the two-dimensionally arranged light receiving elements (SPAD array 141).
  • Since the laser beam L1 has a rectangular shape that is long in the horizontal direction, the region shape of the SPAD array 142 to be read out is also a rectangle that is long in the horizontal direction. Accordingly, the SPAD regions 142-1 to 142-4 into which the SPAD array 142 is divided are divided in the horizontal direction and each have a rectangular shape that is long in the horizontal direction.
  • FIG. 17 is a flowchart showing a processing procedure of the entire processing executed by the ToF sensor 1.
  • First, the light emitting unit 13 emits the laser beam L1 (step S101).
  • Next, the light receiving unit 14 receives the reflected light L2, which is the laser beam L1 reflected by the object 90 (step S102).
  • the calculation unit 15 generates a histogram of the cumulative pixel value based on the detection signal output from the light receiving unit 14 (step S103).
  • Subsequently, the control unit 11 calculates the distance to the object 90 based on the generated histogram (step S104).
  • Then, the control unit 11 outputs the calculated distance to the host 80 (step S105), and ends the process.
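As a minimal sketch tying the steps of FIG. 17 together (the `emit`/`receive` callables and the helpers `accumulate_histogram` and `distance_m` from the earlier sketches are hypothetical, not the patent's interfaces):

```python
def ranging_cycle(emit, receive, n_emissions: int, bin_ns: float) -> float:
    """One distance measurement: S101 emit, S102 receive, S103 histogram,
    S104 distance from the peak flight time; the caller outputs it (S105)."""
    counts = []
    for _ in range(n_emissions):
        emit()                        # S101: emit the laser beam L1
        counts.append(receive())      # S102: per-bin detection counts
    hist = accumulate_histogram(counts)   # S103: cumulative pixel values
    peak_bin = int(hist.argmax())         # S104: flight time at the peak
    return distance_m(peak_bin * bin_ns * 1e-9)
```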
  • FIG. 18 is a flowchart showing a processing procedure of the abnormality detection process executed by the ToF sensor 1.
  • First, the control unit 11 determines whether or not there is a peak whose cumulative pixel value is equal to or greater than a predetermined threshold value among the plurality of peaks included in the histogram (step S201). When there is no peak whose cumulative pixel value is equal to or greater than the predetermined threshold value (step S201: No), the process ends.
  • When there is a peak whose cumulative pixel value is equal to or greater than the predetermined threshold value (step S201: Yes), the control unit 11 determines whether or not there is, among those peaks, a peak whose peak width W is equal to or greater than a predetermined threshold value (step S202).
  • When there is a peak whose peak width W is equal to or greater than the predetermined threshold value (step S202: Yes), the control unit 11 detects that the light emitting unit 13 is abnormal (step S203), and ends the process.
  • On the other hand, when there is no peak whose peak width W is equal to or greater than the predetermined threshold value (step S202: No), the control unit 11 determines that the light emitting unit 13 is normal (step S204).
  • Then, the control unit 11 detects the reflected light L2 from the peaks whose cumulative pixel value is equal to or greater than the predetermined threshold value (step S205), and ends the process.
  • FIG. 19 is a flowchart showing a processing procedure of the detection process executed by the ToF sensor 1.
  • First, the control unit 11 determines whether or not there is a peak whose cumulative pixel value is equal to or greater than a predetermined threshold value (step S301). When there is no such peak (step S301: No), the process ends.
  • When there is such a peak, the control unit 11 extracts a predetermined number of peaks in a predetermined order (step S302). For example, the control unit 11 extracts a predetermined number of peaks in descending order of cumulative pixel value or in ascending order of time.
  • Subsequently, the control unit 11 determines, in the predetermined order, whether or not each peak is valid (step S303). Whether a peak is valid is determined by, for example, whether the shape of the peak and the time of the peak satisfy predetermined conditions.
  • When the peak is valid (step S303: Yes), the control unit 11 determines whether or not the peak widths Wth and Wh of the peak are equal to or greater than predetermined threshold values (step S304).
  • When the peak is invalid (step S303: No), or when the peak widths Wth and Wh are less than the predetermined threshold values (step S304: No), the control unit 11 determines whether or not there is a remaining peak (step S305). That is, it is determined whether or not there is a peak, among the peaks extracted in step S302, for which the determination processing of steps S303 to S304 has not yet been performed.
  • The control unit 11 returns to step S303 when there is a remaining peak (step S305: Yes), and ends the process when there is no remaining peak (step S305: No).
  • When the peak widths Wth and Wh are equal to or greater than the predetermined threshold values in step S304 (step S304: Yes), the control unit 11 detects the peak as the reflected light L2 (step S306), and ends the process.
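A minimal sketch of this flow (the peak records and the validity predicate are assumptions; the patent leaves both unspecified):

```python
def detect_reflected(peaks, th, w_th_min, w_h_min, n_extract, is_valid):
    """peaks: list of dicts with 'height', 'w_th', 'w_h', 'time' keys.
    Follows FIG. 19: S301 threshold, S302 ordered extraction,
    S303 validity, S304 width test, S306 detection."""
    above = [p for p in peaks if p["height"] >= th]       # S301
    if not above:
        return []
    above.sort(key=lambda p: p["height"], reverse=True)   # S302: e.g. by value
    detected = []
    for p in above[:n_extract]:                           # S303-S305 loop
        if not is_valid(p):                               # S303: shape/time
            continue
        if p["w_th"] >= w_th_min and p["w_h"] >= w_h_min:  # S304
            detected.append(p)                            # S306: reflected L2
    return detected
```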
  • the technology according to the present disclosure can be applied to various products.
  • The technology according to the present disclosure may be realized as a device mounted on any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
  • FIG. 20 is a block diagram showing a schematic configuration example of a vehicle control system 7000, which is an example of a mobile control system to which the technique according to the present disclosure can be applied.
  • the vehicle control system 7000 includes a plurality of electronic control units connected via the communication network 7010.
  • In the example shown in FIG. 20, the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, a vehicle exterior information detection unit 7400, an in-vehicle information detection unit 7500, and an integrated control unit 7600.
  • The communication network 7010 connecting these plurality of control units may be an in-vehicle communication network conforming to any standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
  • Each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer or parameters used for various calculations, and a drive circuit that drives various devices to be controlled.
  • Each control unit includes a network I/F for communicating with other control units via the communication network 7010, and a communication I/F for communicating with devices or sensors inside or outside the vehicle by wired or wireless communication. In FIG. 20, as the functional configuration of the integrated control unit 7600, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio image output unit 7670, an in-vehicle network I/F 7680, and a storage unit 7690 are shown.
  • Other control units also include a microcomputer, a communication I / F, a storage unit, and the like.
  • the drive system control unit 7100 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • The drive system control unit 7100 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • the drive system control unit 7100 may have a function as a control device such as ABS (Antilock Brake System) or ESC (Electronic Stability Control).
  • the vehicle condition detection unit 7110 is connected to the drive system control unit 7100.
  • The vehicle state detection unit 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the axial rotational motion of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the accelerator pedal operation amount, the brake pedal operation amount, the steering angle of the steering wheel, the engine speed, the wheel rotation speed, and the like.
  • the drive system control unit 7100 performs arithmetic processing using a signal input from the vehicle state detection unit 7110 to control an internal combustion engine, a drive motor, an electric power steering device, a braking device, and the like.
  • the body system control unit 7200 controls the operation of various devices mounted on the vehicle body according to various programs.
  • the body system control unit 7200 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as head lamps, back lamps, brake lamps, blinkers or fog lamps.
  • Radio waves transmitted from a portable device that substitutes for a key, or signals of various switches, may be input to the body system control unit 7200.
  • the body system control unit 7200 receives inputs of these radio waves or signals and controls a vehicle door lock device, a power window device, a lamp, and the like.
  • the battery control unit 7300 controls the secondary battery 7310, which is the power supply source of the drive motor, according to various programs. For example, information such as the battery temperature, the battery output voltage, or the remaining capacity of the battery is input to the battery control unit 7300 from the battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals, and controls the temperature control of the secondary battery 7310 or the cooling device provided in the battery device.
  • the vehicle outside information detection unit 7400 detects information outside the vehicle equipped with the vehicle control system 7000.
  • The imaging unit 7410 and the vehicle exterior information detection unit 7420 are connected to the vehicle exterior information detection unit 7400.
  • the imaging unit 7410 includes at least one of a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
  • The vehicle exterior information detection unit 7420 includes, for example, at least one of an environmental sensor for detecting the current weather or meteorological conditions and an ambient information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle equipped with the vehicle control system 7000.
  • the environmental sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects the degree of sunshine, and a snow sensor that detects snowfall.
  • the ambient information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device.
  • The imaging unit 7410 and the vehicle exterior information detection unit 7420 may be provided as independent sensors or devices, or may be provided as a device in which a plurality of sensors or devices are integrated.
  • FIG. 21 is a diagram showing an example of installation positions of the image pickup unit 7410 and the vehicle exterior information detection unit 7420.
  • The imaging units 7910, 7912, 7914, 7916, and 7918 are provided at, for example, at least one of the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 7900.
  • The imaging unit 7910 provided on the front nose and the imaging unit 7918 provided on the upper part of the windshield in the vehicle interior mainly acquire images in front of the vehicle 7900.
  • the imaging units 7912 and 7914 provided in the side mirrors mainly acquire images of the side of the vehicle 7900.
  • The imaging unit 7916 provided on the rear bumper or the back door mainly acquires an image of the rear of the vehicle 7900.
  • the imaging unit 7918 provided on the upper part of the windshield in the vehicle interior is mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • Note that FIG. 21 shows an example of the imaging ranges of the imaging units 7910, 7912, 7914, and 7916. The imaging range a indicates the imaging range of the imaging unit 7910 provided on the front nose, the imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided on the side mirrors, respectively, and the imaging range d indicates the imaging range of the imaging unit 7916 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 7910, 7912, 7914, and 7916, a bird's-eye view image of the vehicle 7900 as viewed from above can be obtained.
  • The vehicle exterior information detection units 7920, 7922, 7924, 7926, 7928, and 7930 provided on the front, rear, sides, and corners of the vehicle 7900 and on the upper part of the windshield in the vehicle interior may be, for example, ultrasonic sensors or radar devices.
  • The vehicle exterior information detection units 7920, 7926, and 7930 provided on the front nose, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 7900 may be, for example, LIDAR devices.
  • These out-of-vehicle information detection units 7920 to 7930 are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, or the like.
  • The vehicle exterior information detection unit 7400 causes the imaging unit 7410 to capture an image outside the vehicle and receives the captured image data. Further, the vehicle exterior information detection unit 7400 receives detection information from the connected vehicle exterior information detection unit 7420. When the vehicle exterior information detection unit 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle exterior information detection unit 7400 causes it to transmit ultrasonic waves, electromagnetic waves, or the like, and receives information on the received reflected waves.
  • The vehicle exterior information detection unit 7400 may perform object detection processing or distance detection processing for a person, a vehicle, an obstacle, a sign, a character on a road surface, or the like based on the received information.
  • the vehicle exterior information detection unit 7400 may perform an environment recognition process for recognizing rainfall, fog, road surface conditions, etc., based on the received information.
  • the vehicle outside information detection unit 7400 may calculate the distance to an object outside the vehicle based on the received information.
  • the vehicle exterior information detection unit 7400 may perform image recognition processing or distance detection processing for recognizing a person, a vehicle, an obstacle, a sign, a character on the road surface, or the like based on the received image data.
  • The vehicle exterior information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data, and may synthesize image data captured by different imaging units 7410 to generate a bird's-eye view image or a panoramic image.
  • the vehicle exterior information detection unit 7400 may perform the viewpoint conversion process using the image data captured by different imaging units 7410.
  • The in-vehicle information detection unit 7500 detects information about the inside of the vehicle.
  • For example, a driver state detection unit 7510 that detects the state of the driver is connected to the in-vehicle information detection unit 7500.
  • The driver state detection unit 7510 may include a camera that images the driver, a biosensor that detects biological information of the driver, a microphone that collects sound in the vehicle interior, and the like.
  • The biosensor is provided, for example, on the seat surface or the steering wheel, and detects biological information of a passenger sitting on the seat or of the driver holding the steering wheel.
  • The in-vehicle information detection unit 7500 may calculate the degree of fatigue or the degree of concentration of the driver based on the detection information input from the driver state detection unit 7510, and may determine whether the driver is dozing off.
  • The in-vehicle information detection unit 7500 may perform processing such as noise canceling on the collected audio signal.
  • The integrated control unit 7600 controls the overall operation of the vehicle control system 7000 according to various programs.
  • An input unit 7800 is connected to the integrated control unit 7600.
  • The input unit 7800 is realized by a device that a passenger can operate, such as a touch panel, a button, a microphone, a switch, or a lever. Data obtained by recognizing voice input through the microphone may be input to the integrated control unit 7600.
  • The input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a PDA (Personal Digital Assistant) that supports the operation of the vehicle control system 7000.
  • The input unit 7800 may be, for example, a camera, in which case a passenger can input information by gesture. Alternatively, data obtained by detecting the movement of a wearable device worn by a passenger may be input. Further, the input unit 7800 may include, for example, an input control circuit that generates an input signal based on the information input by a passenger or the like using the input unit 7800 and outputs the input signal to the integrated control unit 7600. By operating the input unit 7800, a passenger or the like inputs various data to the vehicle control system 7000 and instructs processing operations.
  • The storage unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer, and a RAM (Random Access Memory) that stores various parameters, calculation results, sensor values, and the like. The storage unit 7690 may be realized by a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • The general-purpose communication I/F 7620 is a general-purpose communication I/F that mediates communication with various devices existing in the external environment 7750.
  • The general-purpose communication I/F 7620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System of Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as Bluetooth (registered trademark).
  • The general-purpose communication I/F 7620 may connect to a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or an operator-specific network) via, for example, a base station or an access point. Further, the general-purpose communication I/F 7620 may connect to a terminal existing in the vicinity of the vehicle (for example, a terminal of a driver, a pedestrian, or a store, or an MTC (Machine Type Communication) terminal) using, for example, P2P (Peer To Peer) technology.
  • The dedicated communication I/F 7630 is a communication I/F that supports a communication protocol formulated for use in vehicles.
  • The dedicated communication I/F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which is a combination of the lower-layer IEEE 802.11p and the upper-layer IEEE 1609, DSRC (Dedicated Short Range Communications), or a cellular communication protocol.
  • The dedicated communication I/F 7630 typically carries out V2X communication, a concept that includes one or more of vehicle-to-vehicle (Vehicle to Vehicle) communication, road-to-vehicle (Vehicle to Infrastructure) communication, vehicle-to-home (Vehicle to Home) communication, and vehicle-to-pedestrian (Vehicle to Pedestrian) communication.
  • The positioning unit 7640 receives, for example, a GNSS signal from a GNSS (Global Navigation Satellite System) satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite), executes positioning, and generates position information including the latitude, longitude, and altitude of the vehicle.
  • The positioning unit 7640 may specify the current position by exchanging signals with a wireless access point, or may acquire position information from a terminal having a positioning function, such as a mobile phone, a PHS, or a smartphone.
  • The beacon receiving unit 7650 receives radio waves or electromagnetic waves transmitted from a radio station or the like installed on the road, and acquires information such as the current position, traffic congestion, road closures, or required time.
  • The function of the beacon receiving unit 7650 may be included in the dedicated communication I/F 7630 described above.
  • The in-vehicle device I/F 7660 is a communication interface that mediates connections between the microcomputer 7610 and various in-vehicle devices 7760 existing in the vehicle.
  • The in-vehicle device I/F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB).
  • Further, the in-vehicle device I/F 7660 may establish a wired connection such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-definition Link) via a connection terminal (and, if necessary, a cable) (not shown).
  • The in-vehicle devices 7760 include, for example, at least one of a mobile device or a wearable device owned by a passenger, or an information device carried into or attached to the vehicle.
  • The in-vehicle devices 7760 may include a navigation device that searches for a route to an arbitrary destination.
  • The in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
  • The in-vehicle network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010.
  • The in-vehicle network I/F 7680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 7010.
  • The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 according to various programs based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680. For example, the microcomputer 7610 may calculate a control target value of the driving force generator, the steering mechanism, or the braking device based on the acquired information about the inside and outside of the vehicle, and may output a control command to the drive system control unit 7100.
  • The microcomputer 7610 may perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions including collision avoidance or impact mitigation of the vehicle, follow-up driving based on the inter-vehicle distance, vehicle-speed-maintaining driving, vehicle collision warning, vehicle lane departure warning, and the like.
  • Further, the microcomputer 7610 may perform cooperative control for the purpose of automated driving or the like in which the vehicle travels autonomously without relying on the driver's operation, by controlling the driving force generator, the steering mechanism, the braking device, and the like based on the acquired information about the surroundings of the vehicle.
  • The microcomputer 7610 may generate three-dimensional distance information between the vehicle and objects such as surrounding structures and persons based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680, and may create local map information including peripheral information on the current position of the vehicle. Further, the microcomputer 7610 may predict a danger such as a collision of the vehicle, approach of a pedestrian or the like, or entry into a closed road based on the acquired information, and may generate a warning signal.
  • The warning signal may be, for example, a signal for generating a warning sound or turning on a warning lamp.
  • The audio image output unit 7670 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying passengers of the vehicle or the outside of the vehicle of information.
  • An audio speaker 7710, a display unit 7720, and an instrument panel 7730 are exemplified as output devices.
  • The display unit 7720 may include, for example, at least one of an on-board display and a head-up display.
  • The display unit 7720 may have an AR (Augmented Reality) display function.
  • The output device may be a device other than these, such as headphones, a wearable device such as an eyeglass-type display worn by a passenger, a projector, or a lamp.
  • When the output device is a display device, the display device visually displays results obtained by various processes performed by the microcomputer 7610 or information received from other control units in various formats such as text, images, tables, and graphs.
  • When the output device is an audio output device, the audio output device converts an audio signal composed of reproduced audio data, acoustic data, or the like into an analog signal and audibly outputs it.
  • At least two control units connected via the communication network 7010 may be integrated as one control unit.
  • Each control unit may be composed of a plurality of control units.
  • The vehicle control system 7000 may include another control unit (not shown).
  • Another control unit may have a part or all of the functions performed by any of the control units. That is, as long as information is transmitted and received via the communication network 7010, predetermined arithmetic processing may be performed by any of the control units.
  • A sensor or device connected to one of the control units may be connected to another control unit, and a plurality of control units may transmit and receive detection information to and from each other via the communication network 7010.
  • A computer program for realizing each function of the ToF sensor 1 according to the present embodiment described with reference to FIG. 1 can be implemented in any of the control units or the like. A computer-readable recording medium storing such a computer program can also be provided.
  • The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like. The above computer program may also be distributed via, for example, a network without using a recording medium.
  • The ToF sensor 1 according to the present embodiment described with reference to FIG. 1 can be applied to the integrated control unit 7600 of the application example shown in FIG. 20.
  • For example, the control unit 11, the calculation unit 15, and the external I/F 19 of the ToF sensor 1 correspond to the microcomputer 7610, the storage unit 7690, and the in-vehicle network I/F 7680 of the integrated control unit 7600.
  • However, the present invention is not limited to this, and the vehicle control system 7000 may correspond to the host 80 in FIG. 1.
  • At least a part of the components of the ToF sensor 1 according to the present embodiment described with reference to FIG. 1 may be realized in a module (for example, an integrated circuit module composed of one die) for the integrated control unit 7600 shown in FIG. 20. Alternatively, the ToF sensor 1 according to the present embodiment described with reference to FIG. 1 may be realized by a plurality of control units of the vehicle control system 7000 shown in FIG. 20.
  • As described above, the distance measuring device (ToF sensor 1) includes the light emitting unit 13, the light receiving unit 14, and the control unit 11.
  • In the light receiving unit 14, a plurality of light receiving elements (SPAD pixels 20) that receive the reflected light L2 resulting from reflection of the light (laser light L1) of the light emitting unit 13 are arranged two-dimensionally.
  • The control unit 11 performs control to read detection signals from the plurality of light receiving elements for each predetermined number of light receiving elements and to measure the distance.
  • The control unit 11 detects an abnormality in the light emitting unit 13 based on calculated values (pixel values, cumulative pixel values) calculated based on the detection signals of the predetermined number of light receiving elements. As a result, an abnormality in the light emitting unit 13 can be detected with high accuracy at low cost.
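As a rough illustration of this check, the following is a minimal sketch in Python; the array layout, names, and threshold are hypothetical, not the patented implementation:

    # Hypothetical sketch: flag a possible abnormality of the light emitting
    # unit when the aggregated (calculated) value of any single row or column
    # of the light-receiving array meets or exceeds a threshold, which would
    # suggest stray or leaked emitter light rather than a real scene return.
    from typing import List

    def detect_emitter_abnormality(pixel_values: List[List[int]],
                                   threshold: int = 1000) -> bool:
        # pixel_values[r][c] is the calculated value of the macro pixel
        # at row r, column c
        row_sums = [sum(row) for row in pixel_values]
        col_sums = [sum(col) for col in zip(*pixel_values)]
        return any(v >= threshold for v in row_sums + col_sums)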
  • Further, the distance measuring device includes the light emitting unit 13, the light receiving unit 14, and the control unit 11.
  • The light emitting unit 13 emits light (laser light L1) that scans a predetermined distance measuring range.
  • In the light receiving unit 14, a plurality of light receiving elements (SPAD pixels 20) that receive the reflected light L2 resulting from reflection of the light of the light emitting unit 13 are arranged two-dimensionally.
  • The control unit 11 performs control to read a detection signal from the light receiving element, among the plurality of light receiving elements, at a position corresponding to the scanning position of the light emitting unit 13 and to measure the distance. As a result, light can be received efficiently.
  • Further, the distance measuring device (ToF sensor 1) according to the present embodiment includes the light emitting unit 13, the light receiving unit 14, and the control unit 11.
  • The light receiving unit 14 detects the incidence of light (laser light L1).
  • The control unit 11 controls the distance measurement based on the time from when the light emitting unit 13 emits light until the light receiving unit 14 detects the incidence of light.
  • The control unit 11 generates a histogram of the calculated value based on the detection signal output from the light receiving unit 14, and detects, from among a plurality of peaks included in the generated histogram, a peak whose calculated value and peak width are each equal to or greater than a predetermined threshold value. As a result, distance measurement can be performed with high accuracy even in an environment with strong ambient light.
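A minimal sketch of this peak test, assuming the histogram is given as a list of cumulative pixel values per bin; the thresholds and names are illustrative, not from the source:

    # Hypothetical sketch: keep only histogram peaks whose height AND whose
    # width measured at the height threshold are both at or above their
    # thresholds; lower or narrower peaks are treated as ambient-light noise.
    from typing import List, Tuple

    def qualified_peaks(hist: List[int], h_thresh: int,
                        w_thresh: int) -> List[Tuple[int, int]]:
        peaks = []
        i = 0
        while i < len(hist):
            if hist[i] >= h_thresh:
                start = i
                while i < len(hist) and hist[i] >= h_thresh:
                    i += 1
                width = i - start  # width in bins at the height threshold
                if width >= w_thresh:
                    apex = max(range(start, i), key=lambda k: hist[k])
                    peaks.append((apex, width))
            else:
                i += 1
        return peaks  # list of (apex bin index, width in bins)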
  • The present technology can also have the following configurations.
  • (1) A distance measuring device comprising: a light emitting unit; a light receiving unit in which a plurality of light receiving elements that receive reflected light resulting from reflection of the light of the light emitting unit are arranged two-dimensionally; and a control unit that reads detection signals from the plurality of light receiving elements for each predetermined number of the light receiving elements and controls distance measurement, wherein the control unit detects an abnormality in the light emitting unit based on a calculated value calculated based on the detection signals of the predetermined number of light receiving elements.
  • (2) The distance measuring device according to (1), wherein the control unit generates a histogram of the calculated values and detects an abnormality in the light emitting unit based on the histogram.
  • (3) The distance measuring device according to (2), wherein the control unit detects an abnormality in the light emitting unit when a plurality of peaks included in the histogram include a peak whose calculated value and peak width satisfy a predetermined condition.
  • (4) The distance measuring device according to (3), wherein the control unit detects an abnormality in the light emitting unit when the histogram includes a peak whose calculated value is equal to or greater than a predetermined threshold value and whose peak width is equal to or greater than a predetermined threshold value.
  • (5) The distance measuring device according to (4), wherein the threshold value of the peak width is a value corresponding to the pulse width of the light of the light emitting unit.
  • (6) The distance measuring device according to any one of (1) to (5), wherein the calculated value is a pixel value obtained by totaling, for each plurality of light receiving elements, the number of detection signals output from the light receiving unit.
  • (7) The distance measuring device according to any one of (1) to (6), wherein the light receiving element is a SPAD pixel.
  • (8) The distance measuring device according to any one of (1) to (7), wherein the control unit detects an abnormality in the light emitting unit when, among the calculated values of the light receiving elements for each row or column, the calculated value of a specific row or column is equal to or greater than a predetermined threshold value.
  • (9) A distance measuring method executed by a distance measuring device including a light emitting unit and a light receiving unit in which a plurality of light receiving elements that receive reflected light resulting from reflection of the light of the light emitting unit are arranged two-dimensionally, the method including a control step of reading detection signals from the plurality of light receiving elements for each predetermined number of the light receiving elements and controlling distance measurement, wherein the control step detects an abnormality in the light emitting unit based on a calculated value calculated based on the detection signals of the predetermined number of light receiving elements.
  • (10) A distance measuring device comprising: a light emitting unit that emits light that scans a predetermined distance measuring range; a light receiving unit in which a plurality of light receiving elements that receive reflected light resulting from reflection of the light of the light emitting unit are arranged two-dimensionally; and a control unit that performs control to read a detection signal from the light receiving element, among the plurality of light receiving elements, at a position corresponding to the scanning position of the light emitting unit and to measure the distance.
  • (11) The distance measuring device according to (10), wherein the control unit gradually changes the scanning position of the light emitting unit and changes the scanning position of the light receiving unit in predetermined angle units.
  • (12) The distance measuring device according to (11), wherein the control unit aligns the scanning start position of the light emitting unit with the scanning start position of the light receiving unit.
  • (13) The distance measuring device according to (11) or (12), wherein the control unit makes the scanning start position of the light emitting unit and the scanning start position of the light receiving unit different from each other.
  • (14) The distance measuring device according to any one of (11) to (13), wherein the control unit divides the region corresponding to the scanning position of the light receiving unit into a plurality of regions and makes the scanning timing different for each of the divided regions.
  • (15) The distance measuring device according to any one of (10) to (14), wherein the control unit changes the scanning position of the light emitting unit in predetermined angle units and changes the scanning position of the light receiving unit in predetermined angle units.
  • (16) A distance measuring method executed by a distance measuring device including a light emitting unit that emits light that scans a predetermined distance measuring range and a light receiving unit in which a plurality of light receiving elements that receive reflected light resulting from reflection of the light of the light emitting unit are arranged two-dimensionally, the method including a control step of reading a detection signal from the light receiving element, among the plurality of light receiving elements, at a position corresponding to the scanning position of the light emitting unit and measuring the distance.
  • (17) A distance measuring device comprising: a light emitting unit; a light receiving unit that detects the incidence of light; and a control unit that controls distance measurement based on the time from when the light emitting unit emits light until the light receiving unit detects the incidence of light, wherein the control unit generates a histogram of a calculated value based on the detection signal output from the light receiving unit, and detects, from among a plurality of peaks included in the generated histogram, a peak whose calculated value and peak width are each equal to or greater than a predetermined threshold value.
  • (18) The distance measuring device according to (17), wherein the peak width is the peak width at the threshold value of the calculated value.
  • (19) The distance measuring device according to (17) or (18), wherein the peak width is the peak width at a value halfway between the calculated value of the peak and the threshold value of the calculated value.
  • (20) The distance measuring device according to any one of (17) to (19), wherein the control unit detects, as a peak of ambient light, a peak whose calculated value is less than a predetermined threshold value or whose peak width is less than a predetermined threshold value, from among the plurality of peaks included in the histogram.
  • (21) The distance measuring device according to any one of (17) to (20), wherein the control unit extracts peaks satisfying a predetermined condition, as candidates for the reflected light, from among a plurality of peaks whose calculated values are equal to or greater than the predetermined threshold value among the plurality of peaks included in the histogram.
  • (22) The distance measuring device according to (21), wherein the control unit extracts a predetermined number of peaks from the plurality of peaks whose calculated values are equal to or greater than the predetermined threshold value, in descending order of the calculated values.
  • (23) The distance measuring device according to (21) or (22), wherein the control unit extracts a predetermined number of peaks from the plurality of peaks whose calculated values are equal to or greater than the predetermined threshold value, in ascending order of flight time.
  • (24) The distance measuring device according to any one of (19) to (23), wherein the control unit detects, as reflected light, a peak whose flight time is within a predetermined time and whose shape is similar to that of the peak, from among the plurality of peaks included in the histogram.
  • (25) A distance measuring method executed by a distance measuring device including a light emitting unit and a light receiving unit that detects the incidence of light, the method including a control step of controlling distance measurement based on the time from when the light emitting unit emits light until the light receiving unit detects the incidence of light, wherein the control step generates a histogram of a calculated value based on the detection signal output from the light receiving unit, and detects, from among a plurality of peaks included in the generated histogram, a peak whose calculated value and peak width are each equal to or greater than a predetermined threshold value.

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A rangefinder (1) of the present invention is provided with a light-emitting unit (13), a light-receiving unit (14), and a control unit (11). The light-emitting unit (13) emits light (L1) that scans a specific rangefinding area. The light-receiving unit (14) has multiple light-receiving elements (20) which are two-dimensionally arranged and receive reflected light (L2) resulting from reflection of the light emitted from the light-emitting unit (13). The control unit (11) controls rangefinding by reading the detection signals from, among the multiple light-receiving elements, the light-receiving elements at positions corresponding to the scanning position of the light-emitting unit (13).

Description

Distance measuring device and distance measuring method

The present disclosure relates to a distance measuring device and a distance measuring method.

Conventionally, there are distance measuring devices, such as LiDAR (Light Detection and Ranging), that measure the distance to a target object serving as a reflector by emitting laser light to the outside and receiving the reflected light. This type of distance measuring device may be composed of a light emitting unit that emits a plurality of laser beams and a single light receiving element that receives the laser beams (see, for example, Patent Document 1).

Patent Document 1: Japanese Translation of PCT International Application Publication No. 2019-507340

However, in the prior art, there is room for improvement in efficiently receiving the light of the light emitting unit at the light receiving unit.

Therefore, the present disclosure proposes a distance measuring device and a distance measuring method capable of receiving light efficiently.

In order to solve the above problems, a distance measuring device according to one embodiment of the present disclosure includes a light emitting unit, a light receiving unit, and a control unit. The light emitting unit emits light that scans a predetermined distance measuring range. In the light receiving unit, a plurality of light receiving elements that receive reflected light resulting from reflection of the light of the light emitting unit are arranged two-dimensionally. The control unit performs control to read a detection signal from the light receiving element, among the plurality of light receiving elements, at a position corresponding to the scanning position of the light emitting unit, and to measure the distance.
Brief description of the drawings:
  • A block diagram showing a schematic configuration example of the ToF sensor as a distance measuring device according to the present embodiment.
  • A diagram for explaining the optical system of the ToF sensor according to the present embodiment.
  • A block diagram showing a schematic configuration example of the light receiving unit according to the present embodiment.
  • A schematic diagram showing a schematic configuration example of the SPAD array according to the present embodiment.
  • A circuit diagram showing a schematic configuration example of the SPAD pixel according to the present embodiment.
  • A block diagram showing a more detailed configuration example of the SPAD addition unit according to the present embodiment.
  • A diagram showing the scanning timing of each of the light emitting unit and the light receiving unit.
  • Diagrams (four figures) showing the scanning timing of each of the light emitting unit and the light receiving unit according to modifications.
  • Diagrams (two figures) explaining the abnormality detection method of the light emitting unit.
  • Diagrams (two figures) showing the process of discriminating between reflected light and ambient light.
  • A diagram showing the scanning direction of the light emitting unit according to a modification.
  • A diagram showing the scanning direction of the light receiving unit according to a modification.
  • A flowchart showing the processing procedure of the overall processing executed by the ToF sensor.
  • A flowchart showing the processing procedure of the abnormality detection processing executed by the ToF sensor.
  • A flowchart showing the processing procedure of the detection processing executed by the ToF sensor.
  • A block diagram showing a schematic configuration example of a vehicle control system, which is an example of a moving body control system to which the technology according to the present disclosure can be applied.
  • A diagram showing an example of installation positions of the imaging unit and the vehicle exterior information detection unit.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. In each of the following embodiments, the same parts are denoted by the same reference numerals, and redundant description is omitted.

Further, in the present specification and drawings, a plurality of components having substantially the same functional configuration may be distinguished by attaching different numbers after the same reference numeral. However, when it is not necessary to particularly distinguish each of the plurality of components having substantially the same functional configuration, only the same reference numeral is attached.
The description will be given in the following order.
1. Embodiment
 1.1 Distance measuring device (ToF sensor)
 1.2 Optical system
 1.3 Light receiving unit
 1.4 SPAD array
 1.5 SPAD pixel
 1.6 Schematic operation example of the SPAD pixel
 1.7 SPAD addition unit
 1.8 Sampling period
 1.9 Scanning timing of the light emitting unit and the light receiving unit
 1.10 Abnormality detection of the light emitting unit
 1.11 Discrimination between reflected light and ambient light
2. Application example
3. Summary
1. Embodiment
First, the embodiment will be described in detail with reference to the drawings.

1.1 Distance measuring device (ToF sensor)
FIG. 1 is a block diagram showing a schematic configuration example of the ToF sensor as a distance measuring device according to the present embodiment. As shown in FIG. 1, the ToF sensor 1 includes a control unit 11, a light emitting unit 13, a light receiving unit 14, a calculation unit 15, and an external interface (I/F) 19.
The control unit 11 is composed of, for example, an information processing device such as a CPU (Central Processing Unit), and controls each unit of the ToF sensor 1.
The external I/F 19 may be, for example, a communication adapter for establishing communication with an external host 80 via a communication network compliant with an arbitrary standard such as a wireless LAN (Local Area Network), a wired LAN, CAN (Controller Area Network), LIN (Local Interconnect Network), or FlexRay (registered trademark).
Here, when the ToF sensor 1 is mounted on an automobile or the like, the host 80 may be, for example, an ECU (Engine Control Unit) mounted on the automobile or the like. When the ToF sensor 1 is mounted on an autonomous mobile body such as a domestic pet robot, a robot vacuum cleaner, an unmanned aerial vehicle, or a following transport robot, the host 80 may be a control device or the like that controls the autonomous mobile body.
The light emitting unit 13 includes, for example, one or a plurality of semiconductor laser diodes as a light source, and emits pulsed laser light L1 having a predetermined time width at a predetermined period (also referred to as a light emission period). For example, the light emitting unit 13 emits laser light L1 having a time width of 1 ns (nanosecond) at a period of 1 MHz (megahertz). When an object 90 exists within the distance measuring range, the laser light L1 emitted from the light emitting unit 13 is reflected by the object 90 and enters the light receiving unit 14 as reflected light L2.
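For orientation, a minimal numeric check (an illustrative back-of-the-envelope figure, not a specification from the source): a 1 MHz emission period leaves 1 µs between pulses, and since the light must make a round trip, this bounds the range that can be measured without ambiguity between successive pulses.

    # Illustrative check (assumed values, not from the source): the 1 MHz
    # emission period gives 1 microsecond between pulses; the unambiguous
    # range is therefore C * period / 2.
    C = 299_792_458            # speed of light in m/s
    period_s = 1e-6            # 1 MHz emission period
    print(C * period_s / 2)    # ~149.9 m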
Although details will be described later, the light receiving unit 14 includes, for example, a plurality of SPAD pixels arranged in a two-dimensional lattice, and outputs information on the number of SPAD pixels that have detected the incidence of photons after the light emission of the light emitting unit 13 (hereinafter referred to as the detection number; for example, corresponding to the number of detection signals described later). For one light emission of the light emitting unit 13, the light receiving unit 14 detects the incidence of photons at a predetermined sampling period and outputs the detection number.
The calculation unit 15 aggregates the detection number output from the light receiving unit 14 for each plurality of SPAD pixels (for example, corresponding to one or a plurality of macro pixels described later), and based on the pixel values obtained by the aggregation, creates a histogram whose horizontal axis is the flight time and whose vertical axis is the cumulative pixel value. For example, the calculation unit 15 repeats, over a plurality of light emissions of the light emitting unit 13, the operation of aggregating the detection number at a predetermined sampling frequency to obtain a pixel value for one light emission, thereby creating a histogram whose horizontal axis (histogram bins) is the sampling period corresponding to the flight time and whose vertical axis is the cumulative pixel value obtained by accumulating the pixel values obtained in each sampling period.
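As a rough illustration of this aggregation, the following is a minimal sketch in Python, assuming the detection numbers arrive as one list of per-sampling-period counts per emission; all names and data shapes here are hypothetical, not taken from the source:

    # Hypothetical sketch: accumulate per-sampling-period detection counts
    # (pixel values) over many emissions into one histogram whose bin index
    # corresponds to flight time.
    from typing import Iterable, List

    def accumulate_histogram(emissions: Iterable[List[int]],
                             num_bins: int) -> List[int]:
        hist = [0] * num_bins
        for counts in emissions:            # counts for one laser emission
            for bin_idx, count in enumerate(counts[:num_bins]):
                hist[bin_idx] += count      # cumulative pixel value per bin
        return hist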
Further, the calculation unit 15 applies predetermined filter processing to the created histogram, and then specifies, from the filtered histogram, the flight time at which the cumulative pixel value peaks. The calculation unit 15 then calculates, based on the specified flight time, the distance from the ToF sensor 1, or the device on which it is mounted, to the object 90 existing within the distance measuring range. The distance information calculated by the calculation unit 15 may be output to the host 80 or the like via the external I/F 19, for example.
1.2 Optical system
FIG. 2 is a diagram for explaining the optical system of the ToF sensor according to the present embodiment. Note that FIG. 2 illustrates a so-called scan-type optical system that scans the angle of view of the light receiving unit 14 in the horizontal direction; however, the present invention is not limited to this, and a so-called flash-type ToF sensor in which the angle of view of the light receiving unit 14 is fixed is also possible.

As shown in FIG. 2, the ToF sensor 1 includes, as an optical system, a light source 131, a collimator lens 132, a half mirror 133, a galvano mirror 135, a light receiving lens 146, and a SPAD array 141. The light source 131, the collimator lens 132, the half mirror 133, and the galvano mirror 135 are included in, for example, the light emitting unit 13 in FIG. 1. The light receiving lens 146 and the SPAD array 141 are included in, for example, the light receiving unit 14 in FIG. 1.

In the configuration shown in FIG. 2, the laser light L1 emitted from the light source 131 is converted by the collimator lens 132 into rectangular parallel light whose cross-sectional intensity spectrum is long in the vertical direction, and then enters the half mirror 133. The half mirror 133 reflects a part of the incident laser light L1. The laser light L1 reflected by the half mirror 133 enters the galvano mirror 135. The galvano mirror 135 is vibrated in the horizontal direction about a predetermined rotation axis by, for example, a drive unit 134 that operates under the control of the control unit 11. As a result, the laser light L1 is horizontally scanned so that its angle of view SR reciprocates across the distance measuring range AR in the horizontal direction. A MEMS (Micro Electro Mechanical System), a micro motor, or the like can be used for the drive unit 134.

The laser light L1 reflected by the galvano mirror 135 is reflected by an object 90 existing within the distance measuring range AR and enters the galvano mirror 135 as reflected light L2. A part of the reflected light L2 incident on the galvano mirror 135 passes through the half mirror 133 and enters the light receiving lens 146, whereby an image is formed on a specific SPAD array 142 within the SPAD array 141. The SPAD array 142 may be the whole or a part of the SPAD array 141.
1.3 Light receiving unit
FIG. 3 is a block diagram showing a schematic configuration example of the light receiving unit according to the present embodiment. As shown in FIG. 3, the light receiving unit 14 includes a SPAD array 141, a timing control circuit 143, a drive circuit 144, and an output circuit 145.

The SPAD array 141 includes a plurality of SPAD pixels 20 arranged in a two-dimensional lattice. For the plurality of SPAD pixels 20, a pixel drive line LD (the vertical direction in the drawing) is connected for each column, and an output signal line LS (the horizontal direction in the drawing) is connected for each row. One end of each pixel drive line LD is connected to the output end of the drive circuit 144 corresponding to that column, and one end of each output signal line LS is connected to the input end of the output circuit 145 corresponding to that row.

In the present embodiment, all or a part of the SPAD array 141 is used to detect the reflected light L2. The region of the SPAD array 141 that is used (the SPAD array 142) may be a vertically long rectangle identical to the image of the reflected light L2 formed on the SPAD array 141 when the entire laser light L1 is reflected as the reflected light L2. However, the region is not limited to this and may be variously modified, for example, to a region larger or smaller than the image of the reflected light L2 formed on the SPAD array 141.

The drive circuit 144 includes a shift register, an address decoder, and the like, and drives each SPAD pixel 20 of the SPAD array 141 simultaneously for all pixels, in column units, or the like. The drive circuit 144 includes at least a circuit that applies a quench voltage V_QCH, described later, to each SPAD pixel 20 in the selected column of the SPAD array 141, and a circuit that applies a selection control voltage V_SEL, described later, to each SPAD pixel 20 in the selected column. The drive circuit 144 then selects, in column units, the SPAD pixels 20 used for detecting the incidence of photons by applying the selection control voltage V_SEL to the pixel drive line LD corresponding to the column to be read.

A signal (referred to as a detection signal) V_OUT output from each SPAD pixel 20 in the column selectively scanned by the drive circuit 144 is input to the output circuit 145 through the corresponding output signal line LS. The output circuit 145 outputs the detection signal V_OUT input from each SPAD pixel 20 to the SPAD addition unit 40 provided for each macro pixel, described later.

The timing control circuit 143 includes a timing generator and the like that generate various timing signals, and controls the drive circuit 144 and the output circuit 145 based on the various timing signals generated by the timing generator.
1.4 SPAD array
FIG. 4 is a schematic diagram showing a schematic configuration example of the SPAD array according to the present embodiment. As shown in FIG. 4, the SPAD array 142 has, for example, a configuration in which a plurality of SPAD pixels 20 are arranged in a two-dimensional lattice. The plurality of SPAD pixels 20 are grouped into a plurality of macro pixels 30, each composed of a predetermined number of SPAD pixels 20 arranged in the row and/or column direction. The region formed by connecting the outer edges of the SPAD pixels 20 located on the outermost periphery of each macro pixel 30 has a predetermined shape (for example, a rectangle).

The SPAD array 142 is composed of, for example, a plurality of macro pixels 30 arranged in the vertical direction (corresponding to the column direction). In the present embodiment, the SPAD array 142 is divided, for example, in the vertical direction into a plurality of regions (hereinafter referred to as SPAD regions). In the example shown in FIG. 4, the SPAD array 142 is divided into four SPAD regions 142-1 to 142-4. The lowest SPAD region 142-1 corresponds to, for example, the lowest quarter of the angle of view SR of the SPAD array 142; the SPAD region 142-2 above it corresponds to, for example, the second quarter from the bottom of the angle of view SR; the SPAD region 142-3 above that corresponds to, for example, the third quarter from the bottom; and the top SPAD region 142-4 corresponds to, for example, the top quarter of the angle of view SR.
1.5 SPAD pixel
FIG. 5 is a circuit diagram showing a schematic configuration example of the SPAD pixel according to the present embodiment. As shown in FIG. 5, the SPAD pixel 20 includes a photodiode 21 as a light receiving element and a readout circuit 22 that detects that a photon has entered the photodiode 21. The photodiode 21 generates an avalanche current when a photon enters it while a reverse bias voltage V_SPAD equal to or higher than the breakdown voltage is applied between its anode and cathode.

The readout circuit 22 includes a quench resistor 23, a digital converter 25, an inverter 26, a buffer 27, and a selection transistor 24. The quench resistor 23 is composed of, for example, an N-type MOSFET (Metal Oxide Semiconductor Field Effect Transistor; hereinafter referred to as an NMOS transistor) whose drain is connected to the anode of the photodiode 21 and whose source is grounded via the selection transistor 24. A quench voltage V_QCH, preset so that the NMOS transistor acts as a quench resistor, is applied from the drive circuit 144 to the gate of the NMOS transistor constituting the quench resistor 23 via the pixel drive line LD.

In the present embodiment, the photodiode 21 is a SPAD. A SPAD is an avalanche photodiode that operates in Geiger mode when a reverse bias voltage equal to or higher than the breakdown voltage is applied between its anode and cathode, and it can detect the incidence of a single photon.

The digital converter 25 includes a resistor 251 and an NMOS transistor 252. The drain of the NMOS transistor 252 is connected to the power supply voltage VDD via the resistor 251, and its source is grounded. The voltage at the connection point N1 between the anode of the photodiode 21 and the quench resistor 23 is applied to the gate of the NMOS transistor 252.

The inverter 26 includes a P-type MOSFET (hereinafter referred to as a PMOS transistor) 261 and an NMOS transistor 262. The drain of the PMOS transistor 261 is connected to the power supply voltage VDD, and its source is connected to the drain of the NMOS transistor 262. The drain of the NMOS transistor 262 is connected to the source of the PMOS transistor 261, and its source is grounded. The voltage at the connection point N2 between the resistor 251 and the drain of the NMOS transistor 252 is applied to the gate of the PMOS transistor 261 and to the gate of the NMOS transistor 262. The output of the inverter 26 is input to the buffer 27.

The buffer 27 is a circuit for impedance conversion; when an output signal is input from the inverter 26, the buffer 27 impedance-converts the input signal and outputs it as a detection signal V_OUT.

The selection transistor 24 is, for example, an NMOS transistor whose drain is connected to the source of the NMOS transistor constituting the quench resistor 23 and whose source is grounded. The selection transistor 24 is connected to the drive circuit 144, and changes from the off state to the on state when the selection control voltage V_SEL from the drive circuit 144 is applied to its gate via the pixel drive line LD.
1.6 Schematic operation example of the SPAD pixel
The readout circuit 22 illustrated in FIG. 5 operates, for example, as follows. First, during the period in which the selection control voltage V_SEL is applied from the drive circuit 144 to the selection transistor 24 and the selection transistor 24 is in the on state, the reverse bias voltage V_SPAD equal to or higher than the breakdown voltage is applied to the photodiode 21. This permits the operation of the photodiode 21.

On the other hand, during the period in which the selection control voltage V_SEL is not applied from the drive circuit 144 to the selection transistor 24 and the selection transistor 24 is in the off state, the reverse bias voltage V_SPAD is not applied to the photodiode 21, so the operation of the photodiode 21 is prohibited.

When a photon enters the photodiode 21 while the selection transistor 24 is in the on state, an avalanche current is generated in the photodiode 21. The avalanche current flows through the quench resistor 23, and the voltage at the connection point N1 rises. When the voltage at the connection point N1 becomes higher than the on-voltage of the NMOS transistor 252, the NMOS transistor 252 turns on, and the voltage at the connection point N2 changes from the power supply voltage VDD to 0 V. When the voltage at the connection point N2 changes from the power supply voltage VDD to 0 V, the PMOS transistor 261 changes from the off state to the on state, the NMOS transistor 262 changes from the on state to the off state, and the voltage at the connection point N3 changes from 0 V to the power supply voltage VDD. As a result, the buffer 27 outputs a high-level detection signal V_OUT.

Thereafter, as the voltage at the connection point N1 continues to rise, the voltage applied between the anode and the cathode of the photodiode 21 becomes smaller than the breakdown voltage, whereby the avalanche current stops and the voltage at the connection point N1 falls. When the voltage at the connection point N1 becomes lower than the on-voltage of the NMOS transistor 252, the NMOS transistor 252 turns off, and the output of the detection signal V_OUT from the buffer 27 stops (low level).

In this way, the readout circuit 22 outputs a high-level detection signal V_OUT during the period from the timing at which a photon enters the photodiode 21, an avalanche current is generated, and the NMOS transistor 252 turns on, to the timing at which the avalanche current stops and the NMOS transistor 252 turns off. The output detection signal V_OUT is input to the SPAD addition unit 40 of each macro pixel 30 via the output circuit 145. Therefore, each SPAD addition unit 40 receives as many detection signals V_OUT as the number (detection number) of SPAD pixels 20, among the plurality of SPAD pixels 20 constituting one macro pixel 30, in which the incidence of a photon has been detected.
1.7 SPAD addition unit
FIG. 6 is a block diagram showing a more detailed configuration example of the SPAD addition unit according to the present embodiment. The SPAD addition unit 40 may be included in the light receiving unit 14 or in the calculation unit 15.

As shown in FIG. 6, the SPAD addition unit 40 includes, for example, a pulse shaping unit 41 and a light reception counting unit 42.

The pulse shaping unit 41 shapes the pulse waveform of the detection signal V_OUT input from the SPAD array 141 via the output circuit 145 into a pulse waveform having a time width corresponding to the operating clock of the SPAD addition unit 40.

The light reception counting unit 42 counts the detection signals V_OUT input from the corresponding macro pixel 30 in each sampling period, thereby counting, for each sampling period, the number of SPAD pixels 20 in which the incidence of a photon has been detected (the detection number), and outputs this count value as the pixel value of the macro pixel 30.
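A minimal sketch of this counting step, treating each detection signal in the current sampling period as a boolean per SPAD pixel of the macro pixel (the names and data shape are illustrative, not from the source):

    # Hypothetical sketch: per sampling period, the macro pixel's value is
    # the number of its SPAD pixels that produced a detection signal V_OUT.
    from typing import Sequence

    def macro_pixel_value(fired: Sequence[bool]) -> int:
        # fired[i] is True if SPAD pixel i of the macro pixel produced a
        # detection signal in the current sampling period
        return sum(fired)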
 1.8 Sampling period
 Here, the sampling period is the period at which the time (flight time) from when the light emitting unit 13 emits the laser light L1 until the light receiving unit 14 detects the incidence of a photon is measured. The sampling period is set shorter than the light emission period of the light emitting unit 13. For example, by making the sampling period shorter, the flight time of a photon emitted from the light emitting unit 13 and reflected by the object 90 can be calculated with a higher time resolution. This means that raising the sampling frequency makes it possible to calculate the distance to the object 90 with a higher ranging resolution.
 For example, let t be the flight time from when the light emitting unit 13 emits the laser light L1 until the laser light L1 is reflected by the object 90 and the reflected light L2 is incident on the light receiving unit 14. Since the speed of light C is constant (C ≈ 300,000,000 m/s), the distance L to the object 90 can be calculated by the following equation (1).
 L = C × t / 2  (1)
 Accordingly, if the sampling frequency is 1 GHz, the sampling period is 1 ns (nanosecond). In that case, one sampling period corresponds to 15 cm (centimeters). This indicates that the ranging resolution is 15 cm when the sampling frequency is 1 GHz. If the sampling frequency is doubled to 2 GHz, the sampling period becomes 0.5 ns, so one sampling period corresponds to 7.5 cm. This indicates that doubling the sampling frequency halves the ranging resolution. By raising the sampling frequency and shortening the sampling period in this way, the distance to the object 90 can be calculated more accurately.
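 As a quick numerical check of the figures above, here is a hedged sketch (the function name is assumed for illustration) that evaluates one sampling period through equation (1):

    C = 3.0e8  # speed of light in m/s, as approximated in the text

    def range_resolution(sampling_frequency_hz):
        # Distance covered by one sampling period, from L = C * t / 2.
        t = 1.0 / sampling_frequency_hz  # one sampling period in seconds
        return C * t / 2.0

    print(range_resolution(1e9))  # 0.15 m, i.e. 15 cm
    print(range_resolution(2e9))  # 0.075 m, i.e. 7.5 cm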
 1.9 Scanning timing of the light emitting unit and the light receiving unit
 Here, the scanning timings of the light emitting unit 13 and the light receiving unit 14 will be described with reference to FIG. 7. FIG. 7 is a diagram showing the scanning timings of the light emitting unit 13 and the light receiving unit 14. Note that FIG. 7 shows an example in which the ranging range AR (see FIG. 2) is scanned three times.
 The control unit 11 performs control to read detection signals from, among the plurality of light receiving elements, the light receiving elements at the position corresponding to the scanning position of the light emitting unit 13, and to measure the distance. Specifically, the control unit 11 reads the detection signals of the SPAD pixels 20 included in the SPAD array 142 that corresponds, within the SPAD array 141, to the scanning position of the light emitting unit 13.
 More specifically, as shown in FIG. 7, at time t1 the control unit 11 starts scanning the laser light L1 with the light emitting unit 13 and reads the SPAD array 142 at the position in the SPAD array 141 corresponding to the scanning position of the light emitting unit 13. For example, when the light emitting unit 13 scans the ranging range AR from the left end toward the right end, the position read at time t1 is the first SPAD array 142 at the left end of the SPAD array 141. That is, the control unit 11 aligns the scanning start position (angle) of the light emitting unit 13 with the scanning start position (angle) of the light receiving unit 14.
 As shown in FIG. 7, while the light scanning of the light emitting unit 13 is linear, the readout scanning of the light receiving unit 14 is stepwise. That is, the control unit 11 changes the scanning position of the light emitting unit 13 gradually and changes the scanning position (readout position) of the light receiving unit 14 in units of a predetermined angle. In the example shown in FIG. 7, the control unit 11 reads the first SPAD array 142 from time t1 to time t2.
 Then, at time t2, the control unit 11 reads the SPAD array 142 at the position corresponding to the scanning position of the light emitting unit 13, that is, the second SPAD array 142 located immediately to the right of the leftmost SPAD array 142. Thereafter, the control unit 11 similarly reads the SPAD array 142 next on the right at each of times t3, t4, t5 and t6, so that the rightmost SPAD array 142 is read at time t6.
 In this way, by reading the SPAD array 142 at the position corresponding to the scanning position of the light emitting unit 13 in the light receiving unit 14 in which the SPAD pixels 20 are arranged two-dimensionally, the control unit 11 can efficiently receive (read) the light of the light emitting unit 13 with the light receiving unit 14.
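 A minimal sketch of this readout control follows; the function name and the timing parameters are assumptions for illustration. The emitter angle advances linearly with time, while the column to read advances stepwise, one SPAD array 142 per equal time slot, as in FIG. 7.

    def column_to_read(t, t_start, t_end, n_columns):
        # Fraction of the sweep completed: 0.0 at the left end (t1),
        # approaching 1.0 at the right end.
        frac = (t - t_start) / (t_end - t_start)
        col = int(frac * n_columns)  # stepwise readout position
        return min(max(col, 0), n_columns - 1)

    # Six columns read over a 60 us sweep:
    for t_us in (0, 15, 35, 59):
        print(column_to_read(t_us, 0, 60, 6))  # -> 0, 1, 3, 5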
 Although FIG. 7 shows the case where the light emitting unit 13 is scanned linearly, the light scanning by the light emitting unit 13 may instead be made stepwise, as shown in FIG. 8, for example. FIG. 8 is a diagram showing the scanning timings of the light emitting unit 13 and the light receiving unit 14 according to a modified example.
 As shown in FIG. 8, the control unit 11 makes the light scanning by the light emitting unit 13 stepwise. That is, the control unit 11 changes the scanning positions of the light emitting unit 13 and the light receiving unit 14 in steps of a predetermined angle. Specifically, as shown in FIG. 8, the control unit 11 emits light with the scanning position of the light emitting unit 13 fixed at a predetermined angle during the period from time t1 to time t2.
 Further, from time t1 to time t2, the control unit 11 reads the SPAD array 142 at the position corresponding to the scanning position of the light emitting unit 13.
 Then, at time t2, the control unit 11 changes the scanning position of the light emitting unit 13 to the next predetermined angle and reads the next SPAD array 142 (immediately to the right) in the light receiving unit 14.
 That is, the control unit 11 makes the scanning position of the light emitting unit 13 stepwise in accordance with the scanning of the light receiving unit 14. This eliminates the difference in angle of view between the light emitting unit 13 and the light receiving unit 14, so that the light of the light emitting unit 13 can be received efficiently by the light receiving unit 14.
 Further, the control unit 11 is not limited to scanning the light emitting unit 13 stepwise; for example, as shown in FIG. 9, the scanning position of the light receiving unit 14 may be shifted with respect to the scanning position of the light emitting unit 13. FIG. 9 is a diagram showing the scanning timings of the light emitting unit 13 and the light receiving unit 14 according to a modified example.
 As shown in FIG. 9, the control unit 11 reads the SPAD array 142 at a position shifted by a predetermined shift angle α with respect to the angle of the linearly scanning light emitting unit 13. That is, the control unit 11 makes the scanning start position of the light emitting unit 13 and the scanning start position of the light receiving unit 14 differ by the shift angle α. Specifically, as shown in FIG. 9, at time t1 the control unit 11 starts the scanning of the light emitting unit 13 from angle A1.
 Meanwhile, at time t1, the control unit 11 reads the SPAD array 142 at the position corresponding to angle B1. Note that angle B1 is smaller than angle A1 by the shift angle α. More specifically, the shift angle α is an angle corresponding to approximately half the angular interval of the light receiving unit 14, that is, half the amount of change from angle B1 to angle B2.
 Then, at time t2, when the scanning position of the light emitting unit 13 reaches angle A2, the control unit 11 changes the scanning position of the light receiving unit 14 from angle B1 to angle B2 and continues scanning. That is, the control unit 11 changes the scanning position of the light receiving unit 14 in steps of a predetermined angle.
 As a result, between time t1 and time t2, the deviation between the scanning position of the light emitting unit 13 and the scanning position of the light receiving unit 14 stays within the shift angle α at most, so that a decrease in the light receiving efficiency of the light receiving unit 14 can be suppressed.
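 The effect of the shift angle α can be sketched as follows, with hypothetical names and a readout step of step_deg degrees per SPAD array 142. Starting the readout half a step early centers each readout slot on the emitter angle, so the mismatch between the two never exceeds step_deg / 2, which plays the role of α above.

    def column_for_angle(emitter_deg, start_deg, step_deg, n_columns):
        # Round the emitter angle to the nearest readout slot; the
        # implied half-step offset is the shift angle alpha.
        col = int((emitter_deg - start_deg) / step_deg + 0.5)
        return min(max(col, 0), n_columns - 1)

    # Emitter sweeping linearly from 0 to 6 degrees over six 1-degree columns:
    print([column_for_angle(a / 10.0, 0.0, 1.0, 6) for a in range(0, 61, 10)])
    # -> [0, 1, 2, 3, 4, 5, 5]: each column is read while the emitter
    # is within half a degree of its center.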
 Note that the scanning timings of the light emitting unit 13 and the light receiving unit 14 shown in FIGS. 7 to 9 assume that their respective scanning positions are substantially parallel; when the scanning positions of the light emitting unit 13 and the light receiving unit 14 are not substantially parallel, the scanning timing needs to be changed. This point will be described with reference to FIGS. 10A and 10B.
 FIGS. 10A and 10B are diagrams showing the scanning timings of the light emitting unit 13 and the light receiving unit 14 according to a modified example. FIG. 10A shows an example in which the scanning position of the laser light L1 (reflected light L2) of the light emitting unit 13 is not substantially parallel to the scanning positions (SPAD arrays 142a, 142b) of the light receiving unit 14.
 For the scanning position of the laser light L1 shown in FIG. 10A, the light receiving efficiency would not be high regardless of whether the SPAD array 142a or the SPAD array 142b were read.
 Therefore, the control unit 11 divides the SPAD array 142a and the SPAD array 142b into a plurality of regions and reads them in units of the divided regions. FIG. 10A shows an example in which each of the SPAD arrays 142a and 142b is divided into four SPAD regions 142a-1 to 142a-4 and 142b-1 to 142b-4 (see FIG. 4), but the number of divisions may be three or fewer, or five or more.
 In FIG. 10A, the lower two SPAD regions 142a-1 to 142a-2 and 142b-1 to 142b-2 are referred to as first regions 142a-10 and 142b-10, respectively, and the upper two SPAD regions 142a-3 to 142a-4 and 142b-3 to 142b-4 are referred to as second regions 142a-20 and 142b-20, respectively.
 The control unit 11 shifts the scanning timing between the first regions 142a-10, 142b-10 and the second regions 142a-20, 142b-20. Specifically, the control unit 11 scans the first region 142a-10 of the SPAD array 142a and the second region 142b-20 of the SPAD array 142b at the same scanning timing.
 In other words, the control unit 11 makes the scanning timing differ between the first region 142a-10 and the second region 142a-20 of the SPAD array 142a. For example, the control unit 11 scans the second region 142a-20 of the SPAD array 142a at a scanning timing preceding the scanning timing of the first region 142a-10 of the SPAD array 142a and the second region 142b-20 of the SPAD array 142b.
 Further, the control unit 11 scans the first region 142b-10 of the SPAD array 142b at a scanning timing following the scanning timing of the first region 142a-10 of the SPAD array 142a and the second region 142b-20 of the SPAD array 142b.
 This will be described more specifically with reference to FIG. 10B. FIG. 10B shows the scanning timings of the first region 142-10 and the second region 142-20 in one SPAD array 142.
 As shown in FIG. 10B, at time t1 the control unit 11 starts scanning the laser light L1 with the light emitting unit 13 and scans the first region 142-10 of the SPAD array 142.
 Then, at time t2, the control unit 11 scans the second region 142-20 of the SPAD array 142. Note that time t2 is approximately midway between time t1 and time t3. Then, at time t3, the control unit 11 moves the scanning position of the first region 142-10 of the SPAD array 142 one position to the right.
 As a result, even when the scanning position of the light L1 of the light emitting unit 13 is not substantially parallel to the SPAD array 142, a decrease in the light receiving efficiency of the light receiving unit 14 can be suppressed.
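 A rough sketch of this region-wise timing follows, with hypothetical names and assuming two vertically stacked regions per column and the tilt direction of FIG. 10B, in which the second region follows half a column period behind the first:

    def region_column(t, column_period, n_columns, region):
        # region 0 (first region) advances at t1, t3, ...; region 1
        # (second region) advances half a column period later (t2).
        delay = 0.5 * column_period if region == 1 else 0.0
        col = int((t - delay) // column_period)
        return min(max(col, 0), n_columns - 1)

    # Just after the first region has stepped to column 1 (t3), the
    # second region is still reading column 0 until the next half step:
    print(region_column(1.2, 1.0, 6, 0), region_column(1.2, 1.0, 6, 1))  # -> 1 0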
 1.10 Abnormality detection of the light emitting unit
 Next, a method for detecting an abnormality of the light emitting unit 13 will be described with reference to FIGS. 11 and 12. FIGS. 11 and 12 are diagrams for explaining the abnormality detection method for the light emitting unit 13. For example, if scanning stops due to a failure of the drive unit 134 or the galvano mirror 135 of the light emitting unit 13, light is emitted only to a specific position in the ranging range AR. In this case, distance measurement can no longer be performed, and the laser light may also fail to satisfy the Class 1 safety standard.
 Conventionally, in this respect, an abnormality of the light emitting unit was detected by branching the optical path of the light emitting unit, making part of the light incident on a dedicated photodetector or the like, and monitoring whether the light was being projected correctly.
 With this conventional technique, however, it is difficult to cover the various possible failure causes of the light emitting unit, and dedicated components have to be added, which raises the cost of the device and lowers its efficiency.
 Therefore, the control unit 11 detects an abnormality in which the scanning of the light emitting unit 13 stops. Specifically, the control unit 11 detects an abnormality of the light emitting unit 13 when the cumulative pixel value based on the detection signals of each predetermined number of SPAD pixels 20 output from the light receiving unit 14 is equal to or greater than a predetermined threshold value.
 FIGS. 11 and 12 show histograms generated by the calculation unit 15 described above. Specifically, FIGS. 11 and 12 show graphs obtained by linearizing a histogram whose vertical axis is the cumulative pixel value and whose horizontal axis is time (flight time).
 FIG. 11 is a graph for the case where the light emitting unit 13 is normal, and FIG. 12 is a graph for an abnormality in which the pulse width of the light emitting unit 13 has become wide. As shown in FIG. 11, when the light emitting unit 13 is normal, a peak P1 corresponding to the object 90 (see FIG. 1), which is the reflector, appears. The peak P1 has a peak width close to the pulse width of the laser light L1. As shown in FIG. 12, however, when an abnormality occurs in the light emitting unit 13, the pulse width of the laser light L1 becomes wide, and the peak width W of the peak P2 inevitably becomes wide as well.
 Focusing on this point, the control unit 11 detects an abnormality of the light emitting unit 13 when the peak width W of the detected peak P2 is equal to or greater than a predetermined threshold value. This makes it possible to detect an abnormality of the light emitting unit 13 with high accuracy.
 The threshold value for the peak width W is set according to the pulse width of the laser light L1. That is, when the light emitting unit 13 is normal, the peak width of the peak P1, which is the reflected light L2, is approximately equal to the pulse width of the laser light L1; therefore, when the peak width W of the peak P2 is wider than the pulse width of the laser light L1 by a predetermined value or more, the light emitting unit 13 is determined to be abnormal.
 Further, since the control unit 11 detects an abnormality of the light emitting unit 13 from the detection signals of the light receiving unit 14 used for distance measurement, a dedicated photodetector for detecting the laser light is not required, unlike the conventional technique. Therefore, the control unit 11 according to the embodiment can detect an abnormality of the light emitting unit 13 at low cost and with high accuracy.
 Note that the position (cumulative pixel value) at which the peak width W is measured may be arbitrary; for example, the abnormality may be detected using the peak width W at the cumulative pixel value of the threshold TH for peak detection described later (FIG. 13). Alternatively, the abnormality may be detected using the peak width W at the midpoint (half value) between the peak value and the threshold TH.
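 As a hedged sketch of this check (the histogram representation, function names, and margin are assumptions introduced here), the width of the dominant peak can be measured at the level TH and compared against the expected laser pulse width:

    import numpy as np

    def peak_width_bins(hist, level):
        # Width (in sampling-period bins) of the contiguous region
        # around the highest peak where the histogram stays >= level.
        p = int(np.argmax(hist))
        lo, hi = p, p
        while lo > 0 and hist[lo - 1] >= level:
            lo -= 1
        while hi < len(hist) - 1 and hist[hi + 1] >= level:
            hi += 1
        return hi - lo + 1

    def emitter_abnormal(hist, th, pulse_width_bins, margin_bins):
        if hist.max() < th:
            return False  # no peak above TH to judge
        # Abnormal when the peak is wider than the laser pulse by the margin.
        return peak_width_bins(hist, th) >= pulse_width_bins + margin_bins

    normal = np.array([0, 1, 2, 8, 9, 8, 2, 1, 0])          # echo about 3 bins wide
    smeared = np.array([0, 5, 7, 8, 9, 9, 8, 7, 6, 5, 1])   # widened pulse
    print(emitter_abnormal(normal, 4, pulse_width_bins=3, margin_bins=2))   # False
    print(emitter_abnormal(smeared, 4, pulse_width_bins=3, margin_bins=2))  # True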
 Further, as a method for detecting the abnormality in which the scanning of the light emitting unit 13 has stopped, the control unit 11 can also use a detection method based on the amount of light received by the light receiving elements of each row or each column of the light receiving unit 14, in which the light receiving elements (SPAD pixels 20) are arranged two-dimensionally.
 Specifically, the control unit 11 detects the abnormality in which the scanning of the light emitting unit 13 has stopped when, among the light receiving elements arranged two-dimensionally in the row and column directions, the amount of received light (cumulative pixel value) of the light receiving elements of a specific row or column is equal to or greater than a predetermined threshold value. Note that when the scanning direction of the light emitting unit 13 is the horizontal direction (FIG. 2), the control unit 11 detects the abnormality based on the amount of light received by the light receiving elements of each column, and when the scanning direction of the light emitting unit 13 is the vertical direction (FIG. 15), it detects the abnormality based on the amount of light received by the light receiving elements of each row. In this way, the abnormality in which the scanning of the light emitting unit 13 has stopped can be detected with high accuracy.
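 This per-line test can be sketched as follows (hypothetical names; the received light amounts are assumed to be accumulated into a rows-by-columns array over one full sweep). When the scan is stuck, the light concentrates on one column (horizontal scanning) or one row (vertical scanning), so a single line total exceeding the threshold flags the failure:

    import numpy as np

    def scan_stuck(light_sums, threshold, axis=0):
        # axis=0 sums each column (horizontal scan direction, FIG. 2);
        # axis=1 sums each row (vertical scan direction, FIG. 15).
        line_totals = light_sums.sum(axis=axis)
        return bool((line_totals >= threshold).any())

    # Example: the beam kept hitting column 2 of a 4 x 6 pixel array.
    stuck = np.zeros((4, 6))
    stuck[:, 2] = 100.0
    print(scan_stuck(stuck, threshold=300.0, axis=0))  # True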
 1.11 Discrimination between reflected light and ambient light
 Next, the process of discriminating between the reflected light L2 and ambient light will be described with reference to FIGS. 13 and 14. FIGS. 13 and 14 are diagrams showing the process of discriminating between the reflected light L2 and ambient light. The ambient light referred to here is light originating from the surrounding environment, such as sunlight. When such ambient light is relatively strong, the reflected light L2 may be buried by the ambient light in the above-described histogram, so that correct distance measurement becomes impossible. In particular, when the distance to the object 90 is long and a sufficient amount of the reflected light L2 cannot be obtained, accurate distance measurement cannot be performed if the ambient light is strong.
 Therefore, based on the histogram generated by the calculation unit 15, the control unit 11 discriminates between the peak corresponding to the reflected light L2 and peaks corresponding to ambient light. Specifically, the control unit 11 detects, from among the plurality of peaks included in the histogram, a peak whose calculated value and peak width satisfy predetermined conditions.
 Specifically, as shown in FIG. 13, the control unit 11 first extracts, from among the plurality of peaks included in the histogram, the peaks whose cumulative pixel value is equal to or greater than the predetermined threshold TH. Then, from among the extracted peaks, the control unit 11 detects a peak whose peak width is equal to or greater than a predetermined threshold value as the reflected light L2.
 For example, the control unit 11 performs the detection process using the peak width Wth measured at the cumulative pixel value of the threshold TH. Alternatively, the control unit 11 may perform the detection process using the peak width Wh at the midpoint (half value) between the peak value and the threshold TH, or using both the peak width Wth and the peak width Wh.
 In other words, the control unit 11 detects peaks whose cumulative pixel value is less than the predetermined threshold TH, and peaks whose peak widths Wth, Wh are less than the predetermined threshold values, as ambient light. By using the cumulative pixel value and the peak width of the histogram in this way, the control unit 11 can discriminate between the reflected light L2 and ambient light. That is, since the control unit 11 according to the embodiment can discriminate between ambient light and the reflected light L2 even in an environment with strong ambient light, distance measurement can be performed with high accuracy.
 Note that when there are a predetermined number or more of peaks whose cumulative pixel value is equal to or greater than the predetermined threshold TH, the control unit 11 may further extract the peaks satisfying a predetermined condition. For example, the control unit 11 may extract a predetermined number of peaks, from among the peaks whose cumulative pixel value is equal to or greater than the predetermined threshold TH, in ascending order of time (ascending order of distance). As a result, when the present technique is applied to, for example, emergency braking of a vehicle, an object 90 at a short distance can be detected at an early stage.
 Alternatively, the control unit 11 may extract a predetermined number of peaks, from among the peaks whose cumulative pixel value is equal to or greater than the predetermined threshold TH, in descending order of cumulative pixel value. As a result, peaks with high reliability as the reflected light L2 can be extracted, so that the object 90 can be detected (ranged) with higher accuracy.
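 The selection logic of this section can be sketched as follows; the function names, the simple local-maximum peak finder, and the example histogram are assumptions for illustration. Peaks must clear the threshold TH and the width condition, and the survivors are then ordered by time or by cumulative value and truncated to a predetermined number:

    import numpy as np

    def width_at(hist, p, level):
        # Contiguous width (in bins) around bin p where hist >= level.
        lo, hi = p, p
        while lo > 0 and hist[lo - 1] >= level:
            lo -= 1
        while hi < len(hist) - 1 and hist[hi + 1] >= level:
            hi += 1
        return hi - lo + 1

    def find_reflections(hist, th, min_width):
        peaks = []
        for i in range(1, len(hist) - 1):
            if hist[i] >= th and hist[i - 1] <= hist[i] > hist[i + 1]:
                wh = width_at(hist, i, (hist[i] + th) / 2)  # width Wh at the half value
                if wh >= min_width:
                    peaks.append((i, int(hist[i]), wh))
        return peaks

    hist = np.array([0, 2, 9, 2, 0, 1, 6, 7, 6, 1, 0])
    cands = find_reflections(hist, th=5, min_width=2)
    # The narrow spike at bin 2 is rejected as ambient light; the wide
    # echo at bin 7 survives: cands == [(7, 7, 3)]
    nearest = sorted(cands)[:2]                         # ascending time: near objects first
    strongest = sorted(cands, key=lambda c: -c[1])[:2]  # descending cumulative value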
 Note that the control unit 11 may also discriminate between the reflected light L2 and ambient light based on, for example, the shape of the laser light L1 reflected inside the ToF sensor 1. This point will be described with reference to FIG. 14.
 As shown in FIG. 14, the control unit 11 first extracts the peak P1 detected at a short distance, at a time less than a predetermined time, as the laser light L1 due to internal reflection. Note that whether or not the peak P1 is the laser light L1 due to internal reflection may be determined based on, for example, the peak shape at the time of internal reflection of the laser light L1 obtained in advance by experiments or the like.
 Then, the control unit 11 detects the peak P2 whose peak shape is similar to that of the peak P1, which is the internal reflection of the laser light L1, as the reflected light L2. Specifically, the control unit 11 detects a peak P2 whose feature values relating to the peak shape are similar to those of the peak P1. The feature values are, for example, information on the cumulative pixel value at the peak and on the peak width. For example, the control unit 11 may convert the histogram into an image and search for the peak P2 by pattern matching.
 By using the peak shape of the internal reflection peak P1 in this way, the reflected light L2 can be detected with high accuracy.
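 One way to sketch the shape comparison, as a simplification that uses only the height and width features mentioned above rather than full image pattern matching (all names and the tolerance are assumptions introduced here):

    def similar_shape(ref_width, ref_value, cand_width, cand_value, tol=0.3):
        # Compare the width-to-height ratio of a candidate peak against
        # the internal-reflection reference P1; an echo that is a scaled
        # copy of the emitted pulse keeps roughly the same ratio.
        ref_ratio = ref_width / ref_value
        cand_ratio = cand_width / cand_value
        return abs(cand_ratio - ref_ratio) / ref_ratio <= tol

    # P1 (internal reflection): width 4 bins at cumulative value 200.
    # Candidate echo: width 3 bins at value 140 -> ratios 0.020 vs 0.021.
    print(similar_shape(4, 200, 3, 140))  # True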
 In the above description, the case where the angle of view of the light emitting unit 13 is scanned in the horizontal direction has been shown as an example, but the angle of view of the light emitting unit 13 may instead be scanned in the vertical direction. This point will be described with reference to FIGS. 15 and 16.
 FIG. 15 is a diagram showing the scanning direction of the light emitting unit 13 according to a modified example. FIG. 16 is a diagram showing the scanning direction of the light receiving unit 14 according to the modified example.
 As shown in FIG. 15, the laser light L1 emitted from the galvano mirror 135 of the light emitting unit 13 is rectangular collimated light whose cross-sectional intensity spectrum is long in the horizontal direction. The galvano mirror 135 is vibrated in the vertical direction about a predetermined rotation axis by, for example, the drive unit 134 (see FIG. 2), which operates under the control of the control unit 11. As a result, the laser light L1 is scanned vertically so that the angle of view SR of the laser light L1 reflected by the galvano mirror 135 reciprocates vertically over the ranging range AR.
 The laser light L1 reflected by the galvano mirror 135 is reflected by the object 90 existing in the ranging range AR and enters the galvano mirror 135 as the reflected light L2. The reflected light L2 incident on the galvano mirror 135 enters the light receiving unit 14.
 Then, as shown in FIG. 16, the light receiving unit 14 reads, from among the two-dimensionally arranged light receiving elements (SPAD array 141), the SPAD array 142 of the row corresponding to the scanning position of the light emitting unit 13. As shown in FIG. 16, the region shape of the SPAD array 142 to be read is a rectangle long in the horizontal direction; this corresponds to the laser light L1 having a rectangular shape long in the horizontal direction. Accordingly, the SPAD regions 142-1 to 142-4 into which the SPAD array 142 is divided are divided in the horizontal direction, each being a rectangle long in the horizontal direction.
 Next, the processing procedures executed by the ToF sensor 1 will be described with reference to FIGS. 17 to 19. FIG. 17 is a flowchart showing the processing procedure of the overall processing executed by the ToF sensor 1.
 As shown in FIG. 17, the light emitting unit 13 emits the laser light L1 by emitting light (step S101).
 Subsequently, the light receiving unit 14 receives the reflected light L2, which is the laser light L1 reflected by the object 90 (step S102).
 Subsequently, the calculation unit 15 generates a histogram of cumulative pixel values based on the detection signals output from the light receiving unit 14 (step S103).
 Subsequently, the control unit 11 calculates the distance to the object 90 based on the generated histogram (step S104).
 Subsequently, the control unit 11 outputs the calculated distance to the host 80 (step S105) and ends the processing.
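 Steps S101 to S105 can be summarized as one measurement cycle in the following hedged sketch; emit, receive_counts, and output stand in for the hardware interfaces of the units described above and are not part of the embodiment:

    C = 3.0e8  # speed of light in m/s

    def measure_once(emit, receive_counts, sampling_period, output):
        emit()                                   # S101: emit the laser light L1
        hist = receive_counts()                  # S102/S103: receive L2, build histogram
        peak_bin = max(range(len(hist)), key=hist.__getitem__)  # S104: pick echo peak
        t = peak_bin * sampling_period           # flight time of the echo
        output(C * t / 2.0)                      # S104/S105: L = C * t / 2, to the host 80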
 FIG. 18 is a flowchart showing the processing procedure of the abnormality detection processing executed by the ToF sensor 1.
 As shown in FIG. 18, the control unit 11 determines whether, among the plurality of peaks included in the histogram, there is a peak whose cumulative pixel value is equal to or greater than a predetermined threshold value (step S201); if no such peak exists (step S201: No), the processing ends.
 When a peak whose cumulative pixel value is equal to or greater than the predetermined threshold value exists (step S201: Yes), the control unit 11 determines whether, among such peaks, there is a peak whose peak width W is equal to or greater than a predetermined threshold value (step S202).
 When a peak whose peak width W is equal to or greater than the predetermined threshold value exists (step S202: Yes), the control unit 11 detects that the light emitting unit 13 is abnormal (step S203) and ends the processing.
 On the other hand, when no peak whose peak width W is equal to or greater than the predetermined threshold value exists (step S202: No), the control unit 11 detects that the light emitting unit 13 is normal (step S204).
 Subsequently, the control unit 11 detects the reflected light L2 from among the peaks whose cumulative pixel value is equal to or greater than the predetermined threshold value (step S205) and ends the processing.
 FIG. 19 is a flowchart showing the processing procedure of the detection processing executed by the ToF sensor 1.
 As shown in FIG. 19, the control unit 11 determines whether there is a peak whose cumulative pixel value is equal to or greater than a predetermined threshold value (step S301); if no such peak exists (step S301: No), the processing ends.
 When peaks whose cumulative pixel value is equal to or greater than the predetermined threshold value exist (step S301: Yes), the control unit 11 extracts a predetermined number of peaks in a predetermined order (step S302). For example, the control unit 11 extracts the predetermined number of peaks in descending order of cumulative pixel value or in ascending order of time.
 Subsequently, the control unit 11 determines, following the predetermined order, whether each peak is valid (step S303). Whether a peak is valid means, for example, whether the shape of the peak and the time of the peak satisfy predetermined conditions.
 When the peak is valid (step S303: Yes), the control unit 11 determines whether the peak widths Wth, Wh of the peak are equal to or greater than predetermined threshold values (step S304). When the peak is invalid (step S303: No), or when the peak widths Wth, Wh are less than the predetermined threshold values (step S304: No), the control unit 11 determines whether any remaining peaks exist (step S305). That is, the control unit 11 determines whether, among the peaks extracted in step S302, there are peaks that have not yet undergone the determination processing of steps S303 to S304.
 When remaining peaks exist (step S305: Yes), the control unit 11 returns to step S303; when no remaining peaks exist (step S305: No), the processing ends.
 On the other hand, in step S304, when the peak widths Wth, Wh are equal to or greater than the predetermined threshold values (step S304: Yes), the control unit 11 detects the peak as the reflected light L2 (step S306) and ends the processing.
 2. Application examples
 The technology according to the present disclosure can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
 FIG. 20 is a block diagram showing a schematic configuration example of a vehicle control system 7000, which is an example of a moving body control system to which the technology according to the present disclosure can be applied. The vehicle control system 7000 includes a plurality of electronic control units connected via a communication network 7010. In the example shown in FIG. 20, the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, a vehicle exterior information detection unit 7400, a vehicle interior information detection unit 7500, and an integrated control unit 7600. The communication network 7010 connecting these control units may be an in-vehicle communication network conforming to any standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
 Each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer or the parameters used for various calculations, and a drive circuit that drives the devices to be controlled. Each control unit includes a network I/F for communicating with the other control units via the communication network 7010, and a communication I/F for communicating with devices or sensors inside and outside the vehicle by wired or wireless communication. FIG. 20 shows, as the functional configuration of the integrated control unit 7600, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio/image output unit 7670, an in-vehicle network I/F 7680, and a storage unit 7690. The other control units similarly include a microcomputer, a communication I/F, a storage unit, and the like.
 The drive system control unit 7100 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 7100 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like. The drive system control unit 7100 may also function as a control device such as an ABS (Antilock Brake System) or ESC (Electronic Stability Control).
 A vehicle state detection unit 7110 is connected to the drive system control unit 7100. The vehicle state detection unit 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the axial rotational motion of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the operation amount of the accelerator pedal, the operation amount of the brake pedal, the steering angle of the steering wheel, the engine speed, the rotational speed of the wheels, and the like. The drive system control unit 7100 performs arithmetic processing using signals input from the vehicle state detection unit 7110 to control the internal combustion engine, the driving motor, an electric power steering device, a braking device, and the like.
 The body system control unit 7200 controls the operation of various devices mounted on the vehicle body according to various programs. For example, the body system control unit 7200 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, and fog lamps. In this case, radio waves transmitted from a portable device substituting for a key, or signals from various switches, can be input to the body system control unit 7200. The body system control unit 7200 accepts the input of these radio waves or signals and controls the door lock device, power window device, lamps, and the like of the vehicle.
 The battery control unit 7300 controls a secondary battery 7310, which is the power supply source of the driving motor, according to various programs. For example, information such as the battery temperature, the battery output voltage, or the remaining battery capacity is input to the battery control unit 7300 from a battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals to control the temperature regulation of the secondary battery 7310, a cooling device provided in the battery device, or the like.
 The vehicle exterior information detection unit 7400 detects information outside the vehicle equipped with the vehicle control system 7000. For example, at least one of an imaging unit 7410 and a vehicle exterior information detection unit 7420 is connected to the vehicle exterior information detection unit 7400. The imaging unit 7410 includes at least one of a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras. The vehicle exterior information detection unit 7420 includes, for example, at least one of an environmental sensor for detecting the current weather or meteorological conditions, and a surrounding information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle equipped with the vehicle control system 7000.
 The environmental sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects the degree of sunshine, and a snow sensor that detects snowfall. The surrounding information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device. The imaging unit 7410 and the vehicle exterior information detection unit 7420 may each be provided as an independent sensor or device, or may be provided as a device in which a plurality of sensors or devices are integrated.
 Here, FIG. 21 is a diagram showing an example of the installation positions of the imaging unit 7410 and the vehicle exterior information detection unit 7420. Imaging units 7910, 7912, 7914, 7916 and 7918 are provided, for example, at one or more positions among the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of a vehicle 7900. The imaging unit 7910 provided on the front nose and the imaging unit 7918 provided at the upper part of the windshield in the vehicle interior mainly acquire images ahead of the vehicle 7900. The imaging units 7912 and 7914 provided on the side mirrors mainly acquire images of the sides of the vehicle 7900. The imaging unit 7916 provided on the rear bumper or the back door mainly acquires images behind the vehicle 7900. The imaging unit 7918 provided at the upper part of the windshield in the vehicle interior is mainly used to detect preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
 Note that FIG. 21 shows an example of the imaging ranges of the imaging units 7910, 7912, 7914 and 7916. The imaging range a indicates the imaging range of the imaging unit 7910 provided on the front nose, the imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided on the side mirrors, respectively, and the imaging range d indicates the imaging range of the imaging unit 7916 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 7910, 7912, 7914 and 7916, a bird's-eye view image of the vehicle 7900 as viewed from above can be obtained.
 The vehicle exterior information detection units 7920, 7922, 7924, 7926, 7928 and 7930 provided at the front, rear, sides and corners of the vehicle 7900 and at the upper part of the windshield in the vehicle interior may be, for example, ultrasonic sensors or radar devices. The vehicle exterior information detection units 7920, 7926 and 7930 provided on the front nose, the rear bumper, the back door, and at the upper part of the windshield in the vehicle interior of the vehicle 7900 may be, for example, LIDAR devices. These vehicle exterior information detection units 7920 to 7930 are mainly used to detect preceding vehicles, pedestrians, obstacles, and the like.
 Returning to FIG. 20, the description continues. The vehicle exterior information detection unit 7400 causes the imaging unit 7410 to capture images outside the vehicle and receives the captured image data. The vehicle exterior information detection unit 7400 also receives detection information from the connected vehicle exterior information detection unit 7420. When the vehicle exterior information detection unit 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle exterior information detection unit 7400 causes it to transmit ultrasonic waves, electromagnetic waves, or the like, and receives information on the received reflected waves. Based on the received information, the vehicle exterior information detection unit 7400 may perform object detection processing or distance detection processing for people, vehicles, obstacles, signs, characters on the road surface, and the like. Based on the received information, the vehicle exterior information detection unit 7400 may perform environment recognition processing for recognizing rainfall, fog, road surface conditions, and the like. The vehicle exterior information detection unit 7400 may also calculate the distance to an object outside the vehicle based on the received information.
 Further, based on the received image data, the vehicle exterior information detection unit 7400 may perform image recognition processing or distance detection processing for recognizing people, vehicles, obstacles, signs, characters on the road surface, and the like. The vehicle exterior information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data, and may synthesize image data captured by different imaging units 7410 to generate a bird's-eye view image or a panoramic image. The vehicle exterior information detection unit 7400 may also perform viewpoint conversion processing using image data captured by different imaging units 7410.
 車内情報検出ユニット7500は、車内の情報を検出する。車内情報検出ユニット7500には、例えば、運転者の状態を検出する運転者状態検出部7510が接続される。運転者状態検出部7510は、運転者を撮像するカメラ、運転者の生体情報を検出する生体センサ又は車室内の音声を集音するマイク等を含んでもよい。生体センサは、例えば、座面又はステアリングホイール等に設けられ、座席に座った搭乗者又はステアリングホイールを握る運転者の生体情報を検出する。車内情報検出ユニット7500は、運転者状態検出部7510から入力される検出情報に基づいて、運転者の疲労度合い又は集中度合いを算出してもよいし、運転者が居眠りをしていないかを判別してもよい。車内情報検出ユニット7500は、集音された音声信号に対してノイズキャンセリング処理等の処理を行ってもよい。 The in-vehicle information detection unit 7500 detects the in-vehicle information. For example, a driver state detection unit 7510 that detects the driver's state is connected to the in-vehicle information detection unit 7500. The driver state detection unit 7510 may include a camera that captures the driver, a biosensor that detects the driver's biological information, a microphone that collects sound in the vehicle interior, and the like. The biosensor is provided on, for example, the seat surface or the steering wheel, and detects the biometric information of the passenger sitting on the seat or the driver holding the steering wheel. The in-vehicle information detection unit 7500 may calculate the degree of fatigue or concentration of the driver based on the detection information input from the driver state detection unit 7510, and may determine whether the driver is dozing or not. You may. The in-vehicle information detection unit 7500 may perform processing such as noise canceling processing on the collected audio signal.
 統合制御ユニット7600は、各種プログラムにしたがって車両制御システム7000内の動作全般を制御する。統合制御ユニット7600には、入力部7800が接続されている。入力部7800は、例えば、タッチパネル、ボタン、マイクロフォン、スイッチ又はレバー等、搭乗者によって入力操作され得る装置によって実現される。統合制御ユニット7600には、マイクロフォンにより入力される音声を音声認識することにより得たデータが入力されてもよい。入力部7800は、例えば、赤外線又はその他の電波を利用したリモートコントロール装置であってもよいし、車両制御システム7000の操作に対応した携帯電話又はPDA(Personal Digital Assistant)等の外部接続機器であってもよい。入力部7800は、例えばカメラであってもよく、その場合搭乗者はジェスチャにより情報を入力することができる。あるいは、搭乗者が装着したウェアラブル装置の動きを検出することで得られたデータが入力されてもよい。さらに、入力部7800は、例えば、上記の入力部7800を用いて搭乗者等により入力された情報に基づいて入力信号を生成し、統合制御ユニット7600に出力する入力制御回路などを含んでもよい。搭乗者等は、この入力部7800を操作することにより、車両制御システム7000に対して各種のデータを入力したり処理動作を指示したりする。 The integrated control unit 7600 controls the overall operation in the vehicle control system 7000 according to various programs. An input unit 7800 is connected to the integrated control unit 7600. The input unit 7800 is realized by a device such as a touch panel, a button, a microphone, a switch or a lever, which can be input-operated by a passenger. Data obtained by recognizing the voice input by the microphone may be input to the integrated control unit 7600. The input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a PDA (Personal Digital Assistant) that supports the operation of the vehicle control system 7000. You may. The input unit 7800 may be, for example, a camera, in which case the passenger can input information by gesture. Alternatively, data obtained by detecting the movement of the wearable device worn by the passenger may be input. Further, the input unit 7800 may include, for example, an input control circuit that generates an input signal based on the information input by the passenger or the like using the input unit 7800 and outputs the input signal to the integrated control unit 7600. By operating the input unit 7800, the passenger or the like inputs various data to the vehicle control system 7000 and instructs the processing operation.
 The storage unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer, and a RAM (Random Access Memory) that stores various parameters, calculation results, sensor values, and the like. The storage unit 7690 may be realized by a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
 The general-purpose communication I/F 7620 is a general-purpose communication I/F that mediates communication with various devices existing in the external environment 7750. The general-purpose communication I/F 7620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System of Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as wireless LAN (also referred to as Wi-Fi (registered trademark)) or Bluetooth (registered trademark). The general-purpose communication I/F 7620 may connect to a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or an operator-specific network) via, for example, a base station or an access point. The general-purpose communication I/F 7620 may also connect to a terminal existing in the vicinity of the vehicle (for example, a terminal of the driver, a pedestrian, or a store, or an MTC (Machine Type Communication) terminal) using, for example, P2P (Peer To Peer) technology.
 The dedicated communication I/F 7630 is a communication I/F that supports a communication protocol formulated for use in vehicles. The dedicated communication I/F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which is a combination of lower-layer IEEE 802.11p and upper-layer IEEE 1609, DSRC (Dedicated Short Range Communications), or a cellular communication protocol. The dedicated communication I/F 7630 typically carries out V2X communication, a concept that includes one or more of vehicle-to-vehicle (Vehicle to Vehicle) communication, road-to-vehicle (Vehicle to Infrastructure) communication, vehicle-to-home (Vehicle to Home) communication, and vehicle-to-pedestrian (Vehicle to Pedestrian) communication.
 The positioning unit 7640 receives, for example, a GNSS signal from a GNSS (Global Navigation Satellite System) satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite), performs positioning, and generates position information including the latitude, longitude, and altitude of the vehicle. Note that the positioning unit 7640 may identify the current position by exchanging signals with a wireless access point, or may acquire position information from a terminal having a positioning function, such as a mobile phone, PHS, or smartphone.
 The beacon receiving unit 7650 receives, for example, radio waves or electromagnetic waves transmitted from a radio station or the like installed on the road, and acquires information such as the current position, traffic congestion, road closures, or required travel time. Note that the function of the beacon receiving unit 7650 may be included in the dedicated communication I/F 7630 described above.
 The in-vehicle device I/F 7660 is a communication interface that mediates connections between the microcomputer 7610 and various in-vehicle devices 7760 existing in the vehicle. The in-vehicle device I/F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB). The in-vehicle device I/F 7660 may also establish a wired connection such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-definition Link) via a connection terminal (and a cable if necessary), not shown. The in-vehicle devices 7760 may include, for example, at least one of a mobile device or wearable device owned by a passenger, or an information device carried into or attached to the vehicle. The in-vehicle devices 7760 may also include a navigation device that searches for a route to an arbitrary destination. The in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
 The in-vehicle network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010. The in-vehicle network I/F 7680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 7010.
 The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 in accordance with various programs, based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680. For example, the microcomputer 7610 may calculate control target values for the driving force generation device, the steering mechanism, or the braking device based on the acquired information about the inside and outside of the vehicle, and output a control command to the drive system control unit 7100. For example, the microcomputer 7610 may perform cooperative control aimed at realizing ADAS (Advanced Driver Assistance System) functions including vehicle collision avoidance or impact mitigation, following travel based on inter-vehicle distance, constant-speed travel, vehicle collision warning, and vehicle lane departure warning. The microcomputer 7610 may also perform cooperative control aimed at automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on the acquired information about the surroundings of the vehicle.
 The microcomputer 7610 may generate three-dimensional distance information between the vehicle and objects such as surrounding structures and people, based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680, and may create local map information including information about the surroundings of the vehicle's current position. The microcomputer 7610 may also predict dangers such as a vehicle collision, the approach of a pedestrian or the like, or entry onto a closed road based on the acquired information, and generate a warning signal. The warning signal may be, for example, a signal for generating a warning sound or turning on a warning lamp.
 The audio/image output unit 7670 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying passengers of the vehicle or the outside of the vehicle of information. In the example of FIG. 20, an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are illustrated as output devices. The display unit 7720 may include, for example, at least one of an on-board display and a head-up display. The display unit 7720 may have an AR (Augmented Reality) display function. The output device may be a device other than these, such as headphones, a wearable device such as an eyeglass-type display worn by a passenger, a projector, or a lamp. When the output device is a display device, the display device visually displays the results obtained by the various processes performed by the microcomputer 7610 or the information received from other control units in various formats such as text, images, tables, and graphs. When the output device is an audio output device, the audio output device converts an audio signal composed of reproduced audio data, acoustic data, or the like into an analog signal and outputs it audibly.
 Note that in the example shown in FIG. 20, at least two control units connected via the communication network 7010 may be integrated into one control unit. Alternatively, an individual control unit may be composed of a plurality of control units. Furthermore, the vehicle control system 7000 may include another control unit, not shown. In the above description, some or all of the functions performed by any one of the control units may be given to another control unit. In other words, as long as information is transmitted and received via the communication network 7010, a predetermined arithmetic process may be performed by any of the control units. Similarly, a sensor or device connected to one of the control units may be connected to another control unit, and a plurality of control units may mutually transmit and receive detection information via the communication network 7010.
 Note that a computer program for realizing each function of the ToF sensor 1 according to the present embodiment described with reference to FIG. 1 can be implemented in any of the control units or the like. A computer-readable recording medium storing such a computer program can also be provided. The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like. The above computer program may also be distributed, for example, via a network without using a recording medium.
 In the vehicle control system 7000 described above, the ToF sensor 1 according to the present embodiment described with reference to FIG. 1 can be applied to the integrated control unit 7600 of the application example shown in FIG. 20. For example, the control unit 11, the calculation unit 15, and the external I/F 19 of the ToF sensor 1 correspond to the microcomputer 7610, the storage unit 7690, and the in-vehicle network I/F 7680 of the integrated control unit 7600. However, the correspondence is not limited to this, and the vehicle control system 7000 may correspond to the host 80 in FIG. 1.
 Furthermore, at least some of the components of the ToF sensor 1 according to the present embodiment described with reference to FIG. 1 may be realized in a module for the integrated control unit 7600 shown in FIG. 20 (for example, an integrated circuit module composed of a single die). Alternatively, the ToF sensor 1 according to the present embodiment described with reference to FIG. 1 may be realized by a plurality of control units of the vehicle control system 7000 shown in FIG. 20.
 3. Summary
 As described above, according to an embodiment of the present disclosure, the distance measuring device (ToF sensor 1) according to the present embodiment includes a light emitting unit 13, a light receiving unit 14, and a control unit 11. In the light receiving unit 14, a plurality of light receiving elements (SPAD pixels 20) that receive reflected light L2, which is the light (laser light L1) from the light emitting unit 13 reflected by an object, are arranged two-dimensionally. The control unit 11 performs control to read detection signals from the plurality of light receiving elements for each predetermined number of light receiving elements and measure the distance. The control unit 11 detects an abnormality in the light emitting unit 13 based on calculated values (pixel values, cumulative pixel values) computed from the detection signals of the predetermined number of light receiving elements. This makes it possible to detect an abnormality in the light emitting unit 13 at low cost and with high accuracy.
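 For illustration only, the following minimal Python sketch mirrors the anomaly-detection logic summarized above; the threshold values, the function name, and the run-length test used for peak width are assumptions of this sketch, not details taken from the disclosure.

    import numpy as np

    def emitter_abnormal(pixel_values, value_thr=500, width_thr=4):
        # pixel_values: 1-D array of accumulated pixel values, one per histogram bin.
        # value_thr / width_thr: placeholder thresholds; the disclosure leaves the
        # concrete values open, so these numbers are assumptions of this sketch.
        run = 0  # length of the current run of bins at or above value_thr
        for v in np.asarray(pixel_values):
            run = run + 1 if v >= value_thr else 0
            if run >= width_thr:
                # A peak that is both strong and wide is treated here as a sign
                # of a light-emitting-unit fault rather than a genuine reflection.
                return True
        return False

 Requiring both conditions before reporting a fault is the point of the sketch: a strong but narrow peak can be a legitimate echo, while a peak that is both strong and unusually wide is more plausibly explained by an emitter abnormality.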
 Further, according to an embodiment of the present disclosure, the distance measuring device (ToF sensor 1) according to the present embodiment includes a light emitting unit 13, a light receiving unit 14, and a control unit 11. The light emitting unit 13 emits light (laser light L1) that scans a predetermined ranging range. In the light receiving unit 14, a plurality of light receiving elements (SPAD pixels 20) that receive the reflected light L2 of the light from the light emitting unit 13 are arranged two-dimensionally. The control unit 11 performs control to read a detection signal from, among the plurality of light receiving elements, the light receiving element at a position corresponding to the scanning position of the light emitting unit 13, and measure the distance. This allows light to be received efficiently.
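 As a rough sketch of such scan-synchronized readout (the field of view, array width, and the linear angle-to-column mapping below are illustrative assumptions, not values from the disclosure):

    def column_for_scan_angle(scan_angle_deg, fov_deg=60.0, n_cols=192):
        # Map the emitter's current scan angle to the index of the column of
        # light receiving elements whose field of view covers that angle.
        # fov_deg and n_cols are invented values; a linear mapping is assumed.
        fraction = min(max(scan_angle_deg / fov_deg, 0.0), 1.0 - 1e-9)
        return int(fraction * n_cols)

    # Only the selected column is read out; the remaining elements stay idle,
    # so they add neither power draw nor ambient-light counts to the result.
    active_col = column_for_scan_angle(scan_angle_deg=22.5)  # -> column 72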
 Further, according to an embodiment of the present disclosure, the distance measuring device (ToF sensor 1) according to the present embodiment includes a light emitting unit 13, a light receiving unit 14, and a control unit 11. The light receiving unit 14 detects the incidence of light (laser light L1). The control unit 11 performs control to measure the distance based on the time from when the light emitting unit 13 emits light until the light receiving unit 14 detects the incidence of light. The control unit 11 generates a histogram of calculated values based on the detection signals output from the light receiving unit 14, and detects, from among the plurality of peaks included in the generated histogram, a peak whose calculated value and peak width are each equal to or greater than predetermined thresholds. This enables distance measurement with high accuracy even in environments with strong ambient light.
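 A minimal sketch of this peak screening, assuming peak width is measured as the number of consecutive bins at or above the value threshold (the disclosure also allows a half-value width); the thresholds themselves are placeholders:

    def signal_peaks(hist, value_thr, width_thr):
        # Return (start_bin, end_bin) for every peak whose values stay at or
        # above value_thr for at least width_thr consecutive bins; weaker or
        # narrower peaks are treated as ambient-light noise.
        peaks, start = [], None
        for i, v in enumerate(hist):
            if v >= value_thr:
                if start is None:
                    start = i
            else:
                if start is not None and i - start >= width_thr:
                    peaks.append((start, i - 1))
                start = None
        if start is not None and len(hist) - start >= width_thr:
            peaks.append((start, len(hist) - 1))
        return peaks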
 Although the embodiments of the present disclosure have been described above, the technical scope of the present disclosure is not limited to the embodiments as described, and various modifications are possible without departing from the gist of the present disclosure. Components covering different embodiments and modifications may also be combined as appropriate.
 The effects described for each embodiment in the present specification are merely examples and are not limiting; other effects may also be obtained.
The present technology can also have the following configurations.
(1)
A distance measuring device comprising:
 a light emitting unit;
 a light receiving unit in which a plurality of light receiving elements that receive reflected light of light from the light emitting unit are arranged two-dimensionally; and
 a control unit that performs control to read a detection signal from the plurality of light receiving elements for each predetermined number of the light receiving elements and measure a distance,
 wherein the control unit detects an abnormality in the light emitting unit based on a calculated value computed from the detection signals of the predetermined number of the light receiving elements.
(2)
The distance measuring device according to (1), wherein the control unit generates a histogram of the calculated values and detects an abnormality in the light emitting unit based on the histogram.
(3)
The distance measuring device according to (2), wherein the control unit detects an abnormality in the light emitting unit when the plurality of peaks included in the histogram include a peak whose calculated value and peak width satisfy predetermined conditions.
(4)
The distance measuring device according to (3), wherein the control unit detects an abnormality in the light emitting unit when a peak whose calculated value is equal to or greater than a predetermined threshold and whose peak width is equal to or greater than a predetermined threshold is included.
(5)
The distance measuring device according to (4), wherein the threshold for the peak width is a value corresponding to the pulse width of the light from the light emitting unit.
(6)
The distance measuring device according to any one of (1) to (5), wherein the calculated value is a pixel value obtained by totaling the number of detection signals output from the light receiving unit for each plurality of light receiving elements.
(7)
The distance measuring device according to any one of (1) to (6), wherein the light receiving elements are SPAD pixels.
(8)
The distance measuring device according to any one of (1) to (7), wherein the control unit detects an abnormality in the light emitting unit when, among the calculated values of the light receiving elements for each row or column, the calculated value of a specific row or column is equal to or greater than a predetermined threshold.
(9)
A distance measuring method executed by a distance measuring device including a light emitting unit, and a light receiving unit in which a plurality of light receiving elements that receive reflected light of light from the light emitting unit are arranged two-dimensionally, the method including:
 a control step of performing control to read a detection signal from the plurality of light receiving elements for each predetermined number of the light receiving elements and measure a distance,
 wherein the control step detects an abnormality in the light emitting unit based on a calculated value computed from the detection signals of the predetermined number of the light receiving elements.
(10)
A distance measuring device comprising:
 a light emitting unit that emits light that scans a predetermined ranging range;
 a light receiving unit in which a plurality of light receiving elements that receive reflected light of the light from the light emitting unit are arranged two-dimensionally; and
 a control unit that performs control to read a detection signal from, among the plurality of light receiving elements, the light receiving element at a position corresponding to the scanning position of the light emitting unit and measure a distance.
(11)
The distance measuring device according to (10), wherein the control unit gradually changes the scanning position of the light emitting unit and changes the scanning position of the light receiving unit in units of a predetermined angle.
(12)
The distance measuring device according to (11), wherein the control unit aligns the scanning start position of the light emitting unit with the scanning start position of the light receiving unit.
(13)
The distance measuring device according to (11) or (12), wherein the control unit makes the scanning start position of the light emitting unit different from the scanning start position of the light receiving unit.
(14)
The distance measuring device according to any one of (11) to (13), wherein the control unit divides the region corresponding to the scanning position of the light receiving unit into a plurality of regions and uses a different scanning timing for each of the divided regions.
(15)
The distance measuring device according to any one of (10) to (14), wherein the control unit changes the scanning position of the light emitting unit in units of a predetermined angle and changes the scanning position of the light receiving unit in units of the predetermined angle.
(16)
A distance measuring method executed by a distance measuring device including a light emitting unit that emits light that scans a predetermined ranging range, and a light receiving unit in which a plurality of light receiving elements that receive reflected light of the light from the light emitting unit are arranged two-dimensionally, the method including:
 a control step of performing control to read a detection signal from, among the plurality of light receiving elements, the light receiving element at a position corresponding to the scanning position of the light emitting unit and measure a distance.
(17)
A distance measuring device comprising:
 a light emitting unit;
 a light receiving unit that detects incidence of light; and
 a control unit that performs control to measure a distance based on the time from when the light emitting unit emits light until the light receiving unit detects incidence of light,
 wherein the control unit generates a histogram of calculated values based on detection signals output from the light receiving unit, and detects, from among a plurality of peaks included in the generated histogram, a peak whose calculated value and peak width are each equal to or greater than predetermined thresholds.
(18)
The distance measuring device according to (17), wherein the peak width is the peak width at the threshold of the calculated value.
(19)
The distance measuring device according to (17) or (18), wherein the peak width is the peak width at the half value between the calculated value at the peak and the threshold of the calculated value.
(20)
The distance measuring device according to any one of (17) to (19), wherein the control unit detects, among the plurality of peaks included in the histogram, a peak whose calculated value is less than a predetermined threshold or whose peak width is less than a predetermined threshold as a peak of ambient light.
(21)
The distance measuring device according to any one of (17) to (20), wherein the control unit extracts, from among the plurality of peaks included in the histogram whose calculated values are equal to or greater than a predetermined threshold, a peak that satisfies a predetermined condition as a reflected-light candidate.
(22)
The distance measuring device according to (21), wherein the control unit extracts a predetermined number of peaks in descending order of the calculated value from among the plurality of peaks whose calculated values are equal to or greater than the predetermined threshold.
(23)
The distance measuring device according to (21) or (22), wherein the control unit extracts a predetermined number of peaks in ascending order of flight time from among the plurality of peaks whose calculated values are equal to or greater than the predetermined threshold.
(24)
The distance measuring device according to any one of (19) to (23), wherein the control unit detects, as reflected light, a peak among the plurality of peaks included in the histogram whose shape is similar to that of a peak whose flight time is within a predetermined time.
(25)
A distance measuring method executed by a distance measuring device including a light emitting unit, and a light receiving unit that detects incidence of light, the method including:
 a control step of performing control to measure a distance based on the time from when the light emitting unit emits light until the light receiving unit detects incidence of light,
 wherein the control step generates a histogram of calculated values based on detection signals output from the light receiving unit, and detects, from among a plurality of peaks included in the generated histogram, a peak whose calculated value and peak width are each equal to or greater than predetermined thresholds.
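 For configurations (11) to (15) above, the following sketch shows one way the region corresponding to the light receiving unit's scanning position might be divided so that each sub-region is scanned at its own timing; the region count, column count, and timing offsets are invented for illustration only.

    def region_scan_schedule(n_cols=192, n_regions=4, step_us=10.0):
        # Split the columns into contiguous regions and give each region its
        # own scan-timing offset; all three parameter values are invented.
        cols_per_region = n_cols // n_regions
        schedule = []
        for r in range(n_regions):
            first = r * cols_per_region
            schedule.append({"cols": (first, first + cols_per_region - 1),
                             "offset_us": r * step_us})
        return schedule

    for region in region_scan_schedule():
        print(region)  # e.g. {'cols': (0, 47), 'offset_us': 0.0}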
 1 ToF sensor (distance measuring device)
 11 Control unit
 13 Light emitting unit
 14 Light receiving unit
 15 Calculation unit
 20 SPAD pixel
 30 Macro pixel
 80 Host
 90 Object

Claims (7)

  1.  A distance measuring device comprising:
     a light emitting unit that emits light that scans a predetermined ranging range;
     a light receiving unit in which a plurality of light receiving elements that receive reflected light of the light from the light emitting unit are arranged two-dimensionally; and
     a control unit that performs control to read a detection signal from, among the plurality of light receiving elements, the light receiving element at a position corresponding to a scanning position of the light emitting unit and measure a distance.
  2.  The distance measuring device according to claim 1, wherein the control unit gradually changes the scanning position of the light emitting unit and changes a scanning position of the light receiving unit in units of a predetermined angle.
  3.  The distance measuring device according to claim 2, wherein the control unit aligns a scanning start position of the light emitting unit with a scanning start position of the light receiving unit.
  4.  The distance measuring device according to claim 2, wherein the control unit makes a scanning start position of the light emitting unit different from a scanning start position of the light receiving unit.
  5.  The distance measuring device according to claim 2, wherein the control unit divides a region corresponding to the scanning position of the light receiving unit into a plurality of regions and uses a different scanning timing for each of the divided regions.
  6.  The distance measuring device according to claim 1, wherein the control unit changes the scanning position of the light emitting unit in units of a predetermined angle and changes a scanning position of the light receiving unit in units of the predetermined angle.
  7.  A distance measuring method executed by a distance measuring device including a light emitting unit that emits light that scans a predetermined ranging range, and a light receiving unit in which a plurality of light receiving elements that receive reflected light of the light from the light emitting unit are arranged two-dimensionally, the method comprising:
     a control step of performing control to read a detection signal from, among the plurality of light receiving elements, the light receiving element at a position corresponding to a scanning position of the light emitting unit and measure a distance.
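 To make the claimed control step concrete, here is a hedged end-to-end sketch that combines the scan-synchronized readout of claim 1 with the standard time-of-flight conversion d = c * t / 2; the histogram bin width and the helper names in the commented loop are hypothetical and do not come from the claims.

    C_LIGHT = 299_792_458.0  # speed of light [m/s]

    def distance_m(peak_bin, bin_width_s=1e-9):
        # Convert a histogram bin index (round-trip flight time) into a
        # one-way distance with d = c * t / 2; the 1 ns bin width is assumed.
        return C_LIGHT * (peak_bin * bin_width_s) / 2.0

    # Hypothetical measurement loop tying the earlier sketches together; the
    # names read_elements_at and scan_angles are not taken from the patent.
    # for angle in scan_angles:
    #     hist = read_elements_at(column_for_scan_angle(angle))
    #     for start, end in signal_peaks(hist, value_thr=500, width_thr=4):
    #         print(angle, distance_m((start + end) // 2))
    print(distance_m(200))  # bin 200 at 1 ns/bin -> ~29.98 m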
PCT/JP2021/003780 2020-02-14 2021-02-02 Rangefinder and rangefinding method WO2021161858A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022500342A JPWO2021161858A1 (en) 2020-02-14 2021-02-02

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-023008 2020-02-14
JP2020023008 2020-02-14

Publications (1)

Publication Number Publication Date
WO2021161858A1 (en)

Family ID: 77291835

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/003780 WO2021161858A1 (en) 2020-02-14 2021-02-02 Rangefinder and rangefinding method

Country Status (2)

Country Link
JP (1) JPWO2021161858A1 (en)
WO (1) WO2021161858A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7434115B2 (en) 2020-09-07 2024-02-20 株式会社東芝 Photodetector and distance measuring device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000056018A (en) * 1998-08-05 2000-02-25 Denso Corp Distance measuring device
JP2007183246A (en) * 2005-12-08 2007-07-19 Omron Corp Laser scanning device
JP2018537680A (en) * 2015-12-20 2018-12-20 アップル インコーポレイテッドApple Inc. Light detection distance sensor
US20190011556A1 (en) * 2017-07-05 2019-01-10 Ouster, Inc. Light ranging device with electronically scanned emitter array and synchronized sensor array


Also Published As

Publication number Publication date
JPWO2021161858A1 (en) 2021-08-19

Similar Documents

Publication Publication Date Title
JP7246863B2 (en) Photodetector, vehicle control system and rangefinder
EP3477339A1 (en) Distance measuring device and distance measuring method
WO2020116039A1 (en) Ranging device and ranging method
JP2021128084A (en) Ranging device and ranging method
WO2020153275A1 (en) Distance measurement device, in-vehicle system, and distance measurement method
WO2020022137A1 (en) Photodetector and distance measurement apparatus
WO2020100569A1 (en) Control device, control method, and sensor control system
US20220003849A1 (en) Distance measuring device and distance measuring method
US20220057203A1 (en) Distance measurement device and distance measurement method
WO2021019939A1 (en) Light receiving device, method for controlling light receiving device, and distance-measuring device
WO2021161858A1 (en) Rangefinder and rangefinding method
WO2020153182A1 (en) Light detection device, method for driving light detection device, and ranging device
WO2021161857A1 (en) Distance measurement device and distance measurement method
WO2023281824A1 (en) Light receiving device, distance measurment device, and light receiving device control method
WO2023281825A1 (en) Light source device, distance measurement device, and distance measurement method
WO2023162734A1 (en) Distance measurement device
WO2023223928A1 (en) Distance measurement device and distance measurement system
WO2024095625A1 (en) Rangefinder and rangefinding method
WO2023218870A1 (en) Ranging device, ranging method, and recording medium having program recorded therein
WO2024095626A1 (en) Ranging device
WO2023190279A1 (en) Ranging device
WO2023234033A1 (en) Ranging device
WO2022044686A1 (en) Apd sensor and distance measurement system
WO2022244508A1 (en) Optical detection device and distance measurement system
WO2022176532A1 (en) Light receiving device, ranging device, and signal processing method for light receiving device

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 21753082; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2022500342; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 21753082; Country of ref document: EP; Kind code of ref document: A1)