WO2024095626A1 - Ranging device - Google Patents

Ranging device

Info

Publication number
WO2024095626A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
unit
data
light receiving
distance
Prior art date
Application number
PCT/JP2023/033884
Other languages
French (fr)
Japanese (ja)
Inventor
智宏 馬場
祐介 森山
広康 石井
寛浩 萩原
Original Assignee
Sony Semiconductor Solutions Corporation
Priority date
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corporation
Publication of WO2024095626A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to group G01S17/00
    • G01S7/483: Details of pulse systems
    • G01S7/484: Transmitters
    • G01S7/486: Receivers
    • G01S7/4865: Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/93: Lidar systems specially adapted for anti-collision purposes
    • G01S17/931: Lidar systems specially adapted for anti-collision purposes of land vehicles

Definitions

  • This disclosure relates to a distance measuring device.
  • a distance measuring device includes a light emitting unit that emits illumination light, and a distance measuring unit that calculates data regarding the distance to a subject based on incident light obtained by irradiating the subject with the illumination light.
  • the light emitting unit has a plurality of light emitting elements arranged in an array, and a driving unit that sequentially causes the plurality of light emitting elements to emit light for each predetermined unit group within one frame period.
  • the driving unit causes the plurality of light emitting elements in the unit group to emit light at a first emission intensity and then at a second emission intensity different from the first emission intensity within one frame period.
  • multiple light-emitting elements in a unit group emit light at a first emission intensity, and then emit light at a second emission intensity different from the first emission intensity.
  • FIG. 1 is a diagram illustrating an example of functional blocks of a distance measuring device and an information processing device according to an embodiment of the present disclosure.
  • FIG. 2 is a diagram illustrating an example of output data output from the communication unit in FIG. 1.
  • FIG. 3 is a diagram showing an example of a planar configuration of the light-emitting element array of FIG. 1.
  • FIG. 4 is a diagram showing an example of the planar configuration of the DOE of FIG. 1 and an example of the distribution of light emission spots generated by the DOE of FIG. 1.
  • FIG. 5 is a diagram illustrating an example of functional blocks of the light receiving unit in FIG. 1.
  • FIG. 6 is a diagram illustrating an example of a circuit configuration of the light-receiving pixel in FIG. 5.
  • FIG. 7 is a diagram showing an example of light emission by light-emitting pixels and light reception by light-receiving pixels within one frame period.
  • FIG. 8 is a diagram illustrating an example of a histogram generated by the histogram generating unit of FIG. 1.
  • FIG. 9 is a diagram for explaining the Pile-Up phenomenon.
  • FIG. 10 is a diagram for explaining the influence of reflection on the cover glass.
  • FIG. 11 is a diagram showing a modified example of the functional blocks of the distance measuring device and the information processing device in FIG. 1.
  • FIG. 12 is a diagram illustrating an example of data stored in the storage unit of FIG. 11.
  • FIG. 13 is a diagram showing an example of the various drive modes shown in FIG. 12.
  • FIG. 14 is a diagram showing an example of how light is received in the light receiving element array of FIG. 5.
  • FIG. 15 is a diagram showing an example of how light is received in the light receiving element array of FIG. 5.
  • FIG. 16 is a diagram showing an example of how light is received in the light receiving element array of FIG. 5.
  • FIG. 17 is a diagram showing an example of various drive modes.
  • FIG. 18 is a diagram showing an example of a procedure for switching between the various drive modes shown in FIG. 17.
  • FIG. 19 is a diagram showing a modified example of the functional blocks of the distance measuring device and the information processing device in FIG. 1.
  • FIG. 20 is a diagram illustrating an example of a histogram generated by the histogram generating unit of FIG. 19.
  • FIG. 21 is a diagram illustrating an example of a procedure for calculating the reliability.
  • FIG. 22 is a diagram illustrating an example of output data output from the communication unit in FIG. 19.
  • FIG. 23 is a block diagram showing an example of a schematic configuration of a vehicle control system.
  • FIG. 24 is an explanatory diagram showing an example of the installation positions of the outside-vehicle information detection unit and the imaging unit.
  • 1. Example of embodiment: example of switching the light emission intensity within one frame period (FIGS. 1 to 10)
  • 2. Modifications
  • Modification A: example of setting the switching of the light emission intensity for each drive mode and of setting a light receiving area for each drive mode (FIGS. 11 to 16)
  • Modification B: example of switching drive modes using distance data (FIGS. 17 and 18)
  • Modification C: example of calculating a reliability (FIGS. 19 to 22)
  • 3. Application examples (FIGS. 23 and 24)
  • Preferred embodiment [Configuration] FIG. 1 illustrates an example of functional blocks of a distance measuring device 100 according to an embodiment of the present disclosure and an information processing device 200 that processes data output from the distance measuring device 100.
  • the distance measuring device 100 and the information processing device 200 are connected by an FPC 300.
  • the ranging device 100 and the information processing device 200 are electrically connected by a data bus included in the FPC 300.
  • the data bus is a signal transmission path that connects the ranging device 100 and the information processing device 200. Transmission data sent from the ranging device 100 is transmitted from the ranging device 100 to the information processing device 200 via the data bus.
  • the ranging device 100 and the information processing device 200 may be electrically connected by a control bus included in the FPC 300.
  • the control bus is another signal transmission path that connects the ranging device 100 and the information processing device 200, and is a transmission path different from the data bus. Control data sent from the information processing device 200 is transmitted from the information processing device 200 to the ranging device 100 via the control bus.
  • the information processing device 200 includes a communication unit 210, a processor 220, and a storage unit 230.
  • the communication unit 210 is a communication interface configured to be able to communicate with the distance measuring device 100 via the FPC 300.
  • the processor 220 is configured with one or more processors configured with an arithmetic circuit such as an MPU (Micro Processing Unit), various processing circuits, etc.
  • the processor 220 performs various processes, such as a process related to the recording control of data in the storage unit 230 and a process of executing any application software.
  • the processor 220 may control the functions of the distance measuring device 100, for example, by transmitting control information to the distance measuring device 100.
  • the processor 220 may also control the driving mode of the distance measuring device 100, for example, by transmitting driving mode setting information to the distance measuring device 100.
  • [Packet structure] Next, a description will be given of an example of a packet structure used for transmitting transmission data from the distance measuring device 100 to the information processing device 200.
  • the transmission data generated in the distance measuring device 100 is divided into partial data for each row, for example, and the partial data for each row is transmitted using one or more packets.
  • FIG. 2 shows an example of transmission data transmitted from the distance measuring device 100 to the information processing device 200.
  • FIG. 2 shows an example of transmission data used when transmitting data according to the MIPI CSI-2 standard or the MIPI CSI-3 standard.
  • a packet used to transmit transmission data contains a packet header PH, payload data, and a packet footer PF, arranged in that order.
  • the payload data (hereinafter also simply referred to as "payload") contains pixel data of partial data in row units.
  • the packet header PH is, for example, the packet header of the PayloadData of a LongPacket.
  • a LongPacket refers to a packet that is placed between the packet header PH and the packet footer PF.
  • the PayloadData of a LongPacket refers to the main data transmitted between devices.
  • the transmission data is composed of a data frame, for example, as shown in Figure 2.
  • a data frame usually has a header area, a packet area, and a footer area.
  • the header area contains EmbeddedData.
  • EmbeddedData refers to additional information that can be embedded in the header or footer of a data frame. In this case, EmbeddedData contains the frame number, channel number, and emission intensity information.
  • the packet area contains the payload data of the LongPacket for each line, and further contains a packet header PH and a packet footer PF at positions sandwiching the payload data of the LongPacket.
  • the packet area also contains, for example, histogram data described below, or distance data calculated based on a histogram.
  • the histogram data or distance data calculated based on a histogram corresponds to a specific example of "data relating to the distance to the subject" in this disclosure.
  • the payload data of the LongPacket for each line contains pixel data for one line in the histogram data described below, or distance data calculated based on a histogram.
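  • As a rough picture of the frame layout just described, the following sketch (Python; the field names and the literal "PH"/"PF" markers are illustrative assumptions, not the actual MIPI CSI-2 bit layout) shows EmbeddedData in the header area and one LongPacket per row of partial data:

        # Sketch of the data-frame layout described above. Field names and
        # sizes are illustrative assumptions, not the MIPI CSI-2 bit layout.
        from dataclasses import dataclass
        from typing import List, Tuple

        @dataclass
        class EmbeddedData:          # header area of the data frame
            frame_number: int
            channel_number: int
            emission_intensity: int  # e.g. 0 = weak emission, 1 = strong emission

        @dataclass
        class LongPacket:            # one row of histogram/distance data
            packet_header: bytes     # PH
            payload: bytes           # pixel data of one line
            packet_footer: bytes     # PF

        def build_frame(embedded: EmbeddedData,
                        rows: List[bytes]) -> Tuple[EmbeddedData, List[LongPacket]]:
            """Split per-row partial data into LongPackets (PH | payload | PF)."""
            return embedded, [LongPacket(b"PH", row, b"PF") for row in rows]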
  • the distance measuring device 100 includes a light emitting unit 110, a light receiving unit 120, a cover glass 130, a histogram generating unit 140, a communication unit 150, and a controller 160.
  • the light receiving unit 120, the histogram generating unit 140, the communication unit 150, and the controller 160 are provided, for example, in one sensor chip.
  • the light emitting unit 110 has a drive circuit 111, a light emitting element array 112, and a DOE 113.
  • the light-emitting element array 112 has a configuration in which a plurality of light-emitting pixels Pa are arranged in an array, as shown in FIG. 3, for example.
  • Each light-emitting pixel Pa includes a light-emitting element.
  • each light-emitting element outputs light of a predetermined wavelength (also called irradiation light) at a predetermined pulse repetition period (also called PRI period).
  • for example, a VCSEL (vertical cavity surface-emitting laser), or any of various other light sources capable of emitting light of a predetermined wavelength, may be used for each light-emitting element.
  • a number of light-emitting pixels Pa are divided into N channels Ch1 to ChN.
  • Each channel Ch corresponds to a specific unit group that is driven simultaneously by the drive circuit 111.
  • the drive circuit 111 drives each pixel Pa of the light-emitting element array 112.
  • the drive circuit 111 may be configured to be capable of driving each pixel Pa independently, or may be configured to be capable of driving each pixel Pa in a predetermined unit group.
  • the drive circuit 111 may include, for example, a plurality of drive circuits provided in a one-to-one correspondence with each of the plurality of pixels Pa.
  • the drive circuit 111 may include, for example, a plurality of drive circuits provided one for each channel Ch.
  • the drive circuit 111 is configured to sequentially drive the plurality of pixels Pa for each channel Ch. In other words, the drive circuit 111 is configured to sequentially cause the plurality of pixels Pa (light-emitting elements) to emit light for each channel Ch.
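  • As a minimal sketch of this sequential, per-channel driving (the function and constant names below are hypothetical placeholders, not the patent's interface):

        # Sketch: the drive circuit 111 lights the array one channel (unit
        # group) at a time; all pixels Pa in a channel emit together.
        N_CHANNELS = 8  # illustrative value of N

        def drive_channel(ch: int) -> None:
            """Placeholder: simultaneously drive every light-emitting pixel Pa
            belonging to channel `ch` for one burst of PRI-period pulses."""
            ...

        def emit_frame() -> None:
            for ch in range(1, N_CHANNELS + 1):  # Ch1, Ch2, ..., ChN in order
                drive_channel(ch)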
  • the DOE 113 is a diffractive optical element.
  • the DOE 113 splits the light emitted from each light-emitting pixel Pa into multiple beams by using a diffraction grating that utilizes the diffraction phenomenon of light. For example, as shown in FIG. 4, the DOE 113 splits each light beam incident from the light-emitting element array 112 into three beams. This allows the DOE 113 to generate three times the number of light-emitting spots Sp as the number of light-emitting pixels Pa included in the light-emitting element array 112. Note that the number of beams split by the DOE 113 is not limited to three, and may be two, or four or more.
  • the light generated by the DOE 113 is emitted to the outside through the cover glass 130.
  • the light emitted to the outside through the cover glass 130 becomes irradiation light La and reaches the subject.
  • the light emitter 110 emits irradiation light La to the subject.
  • the irradiation light La is reflected by the subject, and the reflected light (incident light Lb) generated by the reflection on the subject is incident on the light receiver 120 through the cover glass 130.
  • the DOE 113 may be omitted if necessary.
  • the light receiving unit 120 receives reflected light (incident light Lb) obtained by irradiating the subject with the irradiation light La.
  • the light receiving unit 120 has a drive circuit 121, a light receiving element array 122, and a TDC (Time-to-Digital Converter) 123.
  • the light receiving element array 122 has a configuration in which a plurality of light receiving pixels Pb are arranged in an array, as shown in FIG. 5, for example.
  • Each light receiving pixel Pb has a light receiving element (SPAD element 21), a readout circuit 22, a latch 26, and an output buffer 27, as shown in FIG. 6, for example.
  • the SPAD element 21 operates in Geiger mode, and generates an avalanche current when a photon is incident while a negative bias voltage VSPAD (e.g., about -10V to -20V) equal to or greater than the breakdown voltage is applied between the anode and cathode.
  • the readout circuit 22 is composed of three transistors 23 to 25.
  • Transistor 23 is, for example, a quench resistor and may be composed of a P-type MOS (Metal-Oxide-Semiconductor) transistor.
  • Transistors 24 and 25 are, for example, selection transistors for selecting the light-receiving pixel Pb.
  • Transistor 24 may be, for example, a P-type MOS transistor, and transistor 25 may be, for example, an N-type MOS transistor.
  • a preset bias voltage (also called a quench voltage) is applied to the gate of transistor 23 to make it act as a quench resistor.
  • an output from latch 26, which will be described later, is applied to the gates of transistors 24 and 25.
  • the latch 26 is composed of a latch circuit 261, a NAND circuit 262, and a buffer 263.
  • the latch circuit 261 holds the selection/non-selection information of the pixel.
  • the NAND circuit 262 calculates a negative logical product of the selection/non-selection information of the pixel held in the latch circuit 261 and the column selection signal YE. The result of this negative logical product is applied to the gates of the selection transistors 24 and 25 in the readout circuit 22, and is also input to the control terminal OE of the output buffer 27 via the buffer 263.
  • when the light-receiving pixel Pb is not selected, the pixel selection signal PXSEL is 0.
  • when the light-receiving pixel Pb is selected, the pixel selection signal PXSEL becomes 1.
  • when a photon is incident on the selected pixel, the cathode potential VS of the SPAD element 21 is discharged toward the 0 V side by the avalanche current, and the cathode potential VS is then recharged by the current source formed by the quench resistor (transistor 23).
  • a detection signal PXOUT whose rising edge coincides with the timing of the detection of this photon is output from the output buffer 27.
  • the output buffer 27 corresponds to the output circuit 126 shown in FIG. 5.
  • the TDC 123 converts the timing of the rising edge of the detection signal PXOUT into a digital value. For example, the TDC 123 measures the flight time of a photon from when the light emitting unit 110 emits light until the reflected light (incident light Lb) is detected by the light receiving pixel Pb, and outputs the measured time as a digital output value TDCOUT.
  • the TDC 123 has a TDC circuit for each light receiving pixel Pb, and measures the elapsed time from the output timing of the irradiated light La to the detection timing of the incident light Lb for each light receiving pixel Pb with high resolution (for example, a period of about 100 ps (picoseconds)), and outputs the time obtained thereby as a digital output value TDCOUT.
  • the driving circuit 121 controls the light receiving element array 122 and the TDC 123.
  • the driving circuit 121 includes, for example, a timing control circuit 124 and a driving circuit 125, as shown in FIG. 5.
  • the timing control circuit 124 includes a timing generator that generates various timing signals, and controls the drive circuit 125 and output circuit 126 based on the various timing signals generated by the timing generator.
  • the drive circuit 125 includes a shift register, an address decoder, and the like, and drives each light-receiving pixel Pb of the light-receiving element array 122.
  • the drive circuit 125 includes at least a circuit that applies a bias voltage (quench voltage) (described later) to each selected light-receiving pixel Pb in the light-receiving element array 122.
  • the detection signal PXOUT output from each light receiving pixel Pb selected by the drive circuit 125 is input to the output circuit 126 through each output signal line LS.
  • the output circuit 126 outputs the detection signal PXOUT input from each light receiving pixel Pb selected by the drive circuit 125 to the TDC 123.
  • the sampling period is the period for measuring the flight time of photons from when the light-emitting unit 110 emits light until the reflected light (incident light Lb) is detected by the light-receiving pixel Pb.
  • the sampling period is set to a period shorter than the PRI period of the light-emitting unit 110. For example, by shortening the sampling period, it becomes possible to estimate or calculate the flight time of photons emitted from the light-emitting unit 110 and reflected by the subject with a higher time resolution. This means that by increasing the sampling frequency, it becomes possible to estimate or calculate the distance to the subject with a higher ranging resolution.
  • the sampling period is, for example, 1 ns (nanosecond). In that case, one sampling period corresponds to 15 cm (centimeters), because light travels about 30 cm in 1 ns and the measured flight time covers the round trip to the subject and back. This indicates that the distance measurement resolution when the sampling frequency is 1 GHz is 15 cm. Furthermore, if the sampling frequency is doubled to 2 GHz, the sampling period becomes 0.5 ns (nanoseconds), and one sampling period corresponds to 7.5 cm (centimeters). This indicates that the distance measurement resolution is halved when the sampling frequency is doubled. In this way, by increasing the sampling frequency and shortening the sampling period, it is possible to estimate or calculate the distance to the subject with greater accuracy.
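  • The resolution figures above follow directly from range = c × t_sample / 2; a small worked check (Python):

        # Worked check of the 15 cm / 7.5 cm figures: one-way range resolution
        # is c * t_sample / 2 because the light travels a round trip.
        C = 299_792_458.0  # speed of light [m/s]

        def range_resolution_cm(sampling_frequency_hz: float) -> float:
            t_sample = 1.0 / sampling_frequency_hz  # sampling period [s]
            return C * t_sample / 2.0 * 100.0       # one-way distance [cm]

        print(range_resolution_cm(1e9))  # ~15.0 cm at 1 GHz (1 ns period)
        print(range_resolution_cm(2e9))  # ~7.5 cm at 2 GHz (0.5 ns period)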
  • the histogram generating unit 140 generates a histogram for each light receiving pixel Pb based on the output value TDCOUT of each light receiving pixel Pb output from the TDC 123.
  • the histogram relates to, for example, the time from when the light emitting pixel Pa is driven until the incidence of light on one or more light receiving pixels Pb is detected.
  • the histogram generating unit 140 generates a histogram for each light receiving pixel Pb by, for example, adding the output value TDCOUT output from the TDC 123 for each light receiving pixel Pb to a count value (accumulated value) stored in a BIN corresponding to the sampling period.
  • the histogram generating unit 140 may calculate distance data based on the generated histogram as necessary.
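  • The accumulation and the optional distance calculation can be sketched as follows (Python; the BIN count, the sampling period, and the peak-picking rule are illustrative assumptions):

        # Sketch of per-pixel histogramming: each TDC output value TDCOUT
        # selects the BIN for its sampling period, whose count is incremented;
        # the peak BIN then yields a time of flight and a distance.
        import numpy as np

        N_BINS = 1024        # illustrative histogram depth
        T_SAMPLE = 1e-9      # illustrative 1 ns sampling period (one BIN each)
        C = 299_792_458.0    # speed of light [m/s]

        def accumulate(hist: np.ndarray, tdcout: int) -> None:
            """Add one output value TDCOUT to the count stored in its BIN."""
            if 0 <= tdcout < N_BINS:
                hist[tdcout] += 1

        def distance_from_peak(hist: np.ndarray) -> float:
            """Distance data from the histogram: peak BIN -> flight time -> m."""
            peak_bin = int(np.argmax(hist))
            return C * (peak_bin * T_SAMPLE) / 2.0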
  • the communication unit 150 is a communication interface configured to be able to communicate with the information processing device 200 via the FPC 300.
  • the communication unit 150 transmits the histogram generated for each light receiving pixel Pb in the histogram generation unit 140, or distance data calculated based on the generated histogram, as transmission data to the information processing device 200 via the FPC 300.
  • the communication unit 150 receives drive mode setting information from the information processing device 200 via the FPC 300.
  • the communication unit 150 outputs the received drive mode setting information to the controller 160.
  • the controller 160 controls the light-emitting unit 110 and the light-receiving unit 120.
  • the controller 160 controls the light emission of the light-emitting unit 110 and the light reception of the light-receiving unit 120.
  • the controller 160 controls the light-emitting unit 110 and the light-receiving unit 120 based on control information (e.g., drive mode setting information) received from the information processing device 200 via the FPC 300, for example.
  • FIG. 7 shows an example of the light emission of the light-emitting pixel Pa and the light reception of the light-receiving pixel Pb within one frame period.
  • the driving unit 111 in accordance with the control of the controller 160, causes a plurality of light-emitting pixels Pa (light-emitting elements) in a unit group (channel ch) to emit light at a first emission intensity and then at a second emission intensity different from the first emission intensity during one frame period.
  • the driving unit 111 further causes a plurality of light-emitting pixels Pa (light-emitting elements) to emit light sequentially for each of channels ch1 to chN during one frame period, in accordance with the control of the controller 160.
  • the first emission intensity is, for example, a relatively weaker emission intensity than the second emission intensity.
  • the second emission intensity is, for example, a relatively stronger emission intensity than the first emission intensity.
  • the driving unit 111 sets the first emission intensity and the second emission intensity according to control information (for example, drive mode setting information) received from the information processing device 200.
  • the driving unit 111 switches the light emission intensity as described above within one frame period
  • in the light emitting unit 110, for example, as shown in FIG. 7, within one frame period, multiple light emitting pixels Pa (light emitting elements) in a unit group (channel ch) emit light at a first light emission intensity and then emit light at a second light emission intensity.
  • multiple light emitting pixels Pa (light emitting elements) emit light sequentially for each of channels ch1 to chN.
  • the drive unit 121 in accordance with the control of the controller 160, causes the multiple light receiving pixels Pb corresponding to the emitting channel ch to sequentially perform exposure, readout, and standby in synchronization with the light emission (weak light emission) of the channel ch in the light emitting unit 110. Furthermore, the drive unit 121, in accordance with the control of the controller 160, causes the multiple light receiving pixels Pb corresponding to the emitting channel ch to sequentially perform exposure, readout, and standby in synchronization with the light emission (strong light emission) of the channel ch in the light emitting unit 110.
  • the driving unit 121 further causes the multiple light receiving pixels Pb (light receiving elements) corresponding to the channel ch that emits light (weak light emission) to sequentially perform exposure, readout, and standby within one frame period in accordance with the control of the controller 160.
  • the driving unit 121 further causes the multiple light receiving pixels Pb (light receiving elements) corresponding to the channel ch that emits light (strong light emission) to sequentially perform exposure, readout, and standby within one frame period in accordance with the control of the controller 160.
  • when the driving unit 121 drives the multiple light receiving pixels Pb (light receiving elements) corresponding to a channel ch that emits light (weak light emission), the multiple light receiving pixels Pb corresponding to that unit group (channel ch) in the light receiving unit 120 perform exposure, readout, and standby, for example, as shown in FIG. 7.
  • likewise, when the driving unit 121 drives the multiple light receiving pixels Pb corresponding to a channel ch that emits light (strong light emission), those light receiving pixels Pb (light receiving elements) perform exposure, readout, and standby.
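  • The weak-then-strong sequencing over the channels can be pictured as below (a sketch only; all function names are placeholders, and the assumption that the weak-emission pass over all channels completes before the strong-emission pass is one reading of FIG. 7):

        # Sketch of one frame period: every channel emits at the first (weak)
        # intensity, then every channel emits at the second (strong) intensity,
        # with the matching pixels Pb exposed and read out in sync each time.
        def emit(ch: int, intensity: str) -> None: ...  # pixels Pa fire
        def expose(ch: int) -> None: ...                # pixels Pb exposed
        def read_out(ch: int) -> None: ...              # TDC values read
        def standby(ch: int) -> None: ...               # pixels Pb wait

        def run_one_frame(n_channels: int) -> None:
            for intensity in ("weak", "strong"):
                for ch in range(1, n_channels + 1):
                    emit(ch, intensity)
                    expose(ch)
                    read_out(ch)
                    standby(ch)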
  • the histogram generating unit 140 generates a histogram (first data related to the distance to the subject) for each light receiving pixel Pb based on the output value TDCOUT of the multiple light receiving pixels Pb (light receiving elements) corresponding to the channel ch that emits light (weak light emission) output from the TDC 123.
  • the histogram generating unit 140 generates a histogram (first data related to the distance to the subject) for each light receiving pixel Pb by, for example, adding the output value TDCOUT output from the TDC 123 for each light receiving pixel Pb (light receiving element) corresponding to the channel ch that emits light (weak light emission) to a count value (accumulated value) stored in a BIN corresponding to the sampling period.
  • the histogram generating unit 140 may calculate distance data based on the generated histogram (first data related to the distance to the subject) as necessary.
  • the histogram generating unit 140 generates a histogram (second data related to the distance to the subject) for each light receiving pixel Pb based on the output values TDCOUT of the multiple light receiving pixels Pb (light receiving elements) corresponding to the channel ch that emits light (strong light emission) output from the TDC 123.
  • the histogram generating unit 140 generates a histogram (second data related to the distance to the subject) for each light receiving pixel Pb by, for example, adding the output values TDCOUT output from the TDC 123 for each light receiving pixel Pb (light receiving elements) corresponding to the channel ch that emits light (strong light emission) to a count value (accumulated value) stored in a BIN corresponding to the sampling period.
  • the histogram generating unit 140 may calculate distance data based on the generated histogram (second data related to the distance to the subject) as necessary.
  • the communication unit 150 outputs at least one of the first data and the second data calculated by the histogram generation unit 140 and data on the light emission intensity of the light-emitting unit 110 as transmission data in a predetermined format. If necessary, the communication unit 150 may output distance data corresponding to at least one of the first data and the second data and data on the light emission intensity of the light-emitting unit 110 as transmission data in a predetermined format.
  • Fig. 8 shows an example of a histogram generated by the histogram generating unit 140.
  • the histogram obtained with weak light emission has a peak value at a BIN number smaller than the BIN number corresponding to a distance of 1 m, as shown in Fig. 8(A), for example.
  • if the peak value is greater than a predetermined threshold value at this time, the distance corresponding to the peak value corresponds to the distance to the subject.
  • Figs. 8(A) and 8(B) show 1 m as an example of the boundary between close and not close distances, but the boundary is not limited to 1 m.
  • the histogram obtained with strong light emission has a peak value at a BIN number smaller than the BIN number corresponding to a distance of 1 m, as shown in FIG. 8B, for example.
  • however, the so-called pile-up phenomenon occurs, so even if the peak value at this time is greater than a predetermined threshold, the distance corresponding to the peak value is shorter than the actual distance to the subject. In this pile-up phenomenon, for example, as shown in FIG. 9, the peak of the histogram appears at a second BIN number instead of the first BIN number corresponding to the actual distance, and the second BIN number is smaller than the first BIN number by a predetermined magnitude (ΔX). Therefore, when the subject is relatively close, it is desirable to derive the distance to the subject based on the histogram obtained with weak light emission, not the histogram obtained with strong light emission.
  • when incident light Lb enters the light receiving element array 122 and a light receiving element reacts, the reacting light receiving element then enters a certain period during which it is in principle unable to detect light (dead time). For example, when illumination light La of strong intensity is emitted from the light-emitting pixel Pa, the light reflected by the cover glass 130 also becomes strong and enters the light receiving element. At this time, there is a high probability that the light receiving element will react and then enter the dead time, so, for example, as shown in FIG. 10(A), the light receiving element will not be able to detect light reflected from a nearby subject.
  • the histogram obtained with weak light emission does not have a peak value exceeding the threshold, as shown in FIG. 8(C), for example. This is because the intensity of the reflected light is insufficient.
  • the histogram obtained with strong light emission has a peak value exceeding the threshold beyond the boundary (1 m) between close and not close distances, as shown in FIG. 8(D), for example. Note that although 1 m is shown as an example of the boundary between close and not close distances in FIG. 8(C) and FIG. 8(D), the boundary is not limited to 1 m.
  • multiple light-emitting pixels Pa (light-emitting elements) in a unit group (channel ch) emit light at a first emission intensity, and then emit light at a second emission intensity different from the first emission intensity.
  • when the subject is at a close distance, for example, the light emission at the first (weak) intensity yields data that has little measurement error due to the pile-up phenomenon and is almost free of the effects of dead time due to reflection on the cover glass.
  • when the subject is not at a close distance, the light emission at the second (strong) intensity yields data from which the distance to the subject can be calculated. Therefore, the distance to the subject can be calculated with greater accuracy than when light is emitted at a single intensity.
  • a histogram (first data related to the distance to the subject) is generated for each light receiving pixel Pb based on the output value TDCOUT of the multiple light receiving pixels Pb (light receiving elements) corresponding to the channel ch that emits light (weak light emission) output from the TDC 123. Furthermore, a histogram (second data related to the distance to the subject) is generated for each light receiving pixel Pb based on the output value TDCOUT of the multiple light receiving pixels Pb (light receiving elements) corresponding to the channel ch that emits light (strong light emission) output from the TDC 123.
  • the first data is data that has little measurement error due to the pile-up phenomenon and is almost free of the influence of dead time due to reflection on the cover glass.
  • the second data is data from which the distance to the subject can be calculated even when the subject is not close. Therefore, the distance to the subject can be calculated with higher accuracy than when light is emitted at a single intensity.
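  • The choice between the two histograms described above might look like this (a sketch; the 1 m boundary and the threshold are the illustrative values from FIG. 8, and bin_to_m is an assumed helper converting a BIN number to meters):

        # Sketch: prefer the weak-emission histogram (first data) for close
        # subjects, to avoid pile-up and cover-glass dead-time errors, and
        # fall back to the strong-emission histogram (second data) otherwise.
        import numpy as np

        BOUNDARY_M = 1.0  # example boundary between close and not close
        THRESHOLD = 100   # example peak-count detection threshold

        def select_distance(hist_weak, hist_strong, bin_to_m):
            peak_w = int(np.argmax(hist_weak))
            if hist_weak[peak_w] > THRESHOLD and bin_to_m(peak_w) < BOUNDARY_M:
                return bin_to_m(peak_w)      # close subject: trust first data
            peak_s = int(np.argmax(hist_strong))
            if hist_strong[peak_s] > THRESHOLD:
                return bin_to_m(peak_s)      # not close: trust second data
            return None                      # no reliable peak in either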
  • At least one of the first data and the second data calculated by the histogram generating unit 140 and data on the light emission intensity of the light emitting unit 110 are output as output data (transmission data) in a predetermined format. This allows the information processing device 200 to accurately calculate the distance to the subject based on the transmission data.
  • the distance measuring device 100 may include a storage unit 170, for example, as shown in Fig. 11.
  • the storage unit 170 may be configured, for example, with a random access memory (RAM), a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
  • the storage unit 170 may store a data set in which MP setting data corresponding to a drive mode is defined, for example, as shown in Fig. 12. This data set will be described in detail later.
  • the various drive modes illustrated in FIG. 12 have, for example, the conditions and features shown in FIG. 13. As shown in FIG. 13, when the drive mode is Dual1 or Dual2, the controller 160 instructs the drive unit 111 to switch the light emission intensity. Also, as shown in FIG. 13, when the drive mode is Single1 or Single2, the controller 160 instructs the drive unit 111 to drive at a single light emission intensity.
  • the information processing device 200 transmits drive mode setting information to the distance measuring device 100.
  • when the distance measuring device 100 (controller 160) receives the drive mode setting information from the information processing device 200 (processor 220), it instructs the drive unit 111 to set the light emission intensity according to the received drive mode setting information.
  • for example, when the received drive mode setting information specifies switching of the light emission intensity, the controller 160 instructs the drive unit 111 to make the multiple light emitting pixels Pa (light emitting elements) in the unit group (channel ch) emit light at a first light emission intensity and then at a second light emission intensity during one frame period. This makes it possible to select a drive mode according to the distance to the subject, and therefore to accurately calculate the distance to the subject.
  • the MP setting data refers to, for example, address data of the light receiving pixels Pb to be activated according to the drive mode.
  • This address data refers to address data of some of the light receiving pixels Pb (light receiving elements) among the light receiving pixels Pb (light receiving elements) included in the light receiving element array 122.
  • a mode that assumes the subject is relatively far away (for example, Single1) is assigned address data different from the address data assigned to a mode that assumes the subject is relatively close.
  • when the drive mode is Dual1, the controller 160 reads out the MP setting data data1 corresponding to Dual1 from the data set in the storage unit 170, and instructs the drive unit 121 to activate the multiple light receiving pixels Pb (light receiving elements) corresponding to the read MP setting data data1.
  • the drive unit 121 activates the multiple light receiving pixels Pb that correspond to the MP setting data data1 among the multiple light receiving pixels Pb, thereby causing the multiple active light receiving pixels Pb to output light receiving data.
  • when the drive mode is Single1, the controller 160 reads out the MP setting data data2 corresponding to Single1 from the data set in the storage unit 170, and instructs the drive unit 121 to activate the multiple light receiving pixels Pb (light receiving elements) corresponding to the read MP setting data data2.
  • the drive unit 121 activates the multiple light receiving pixels Pb that correspond to the MP setting data data2 among the multiple light receiving pixels Pb, thereby causing the multiple active light receiving pixels Pb to output light receiving data.
  • when the drive mode is Single2, the controller 160 reads out the MP setting data data3 corresponding to Single2 from the data set in the storage unit 170, and instructs the drive unit 121 to activate the multiple light receiving pixels Pb (light receiving elements) corresponding to the read MP setting data data3.
  • the drive unit 121 activates the multiple light receiving pixels Pb that correspond to the MP setting data data3 among the multiple light receiving pixels Pb, thereby causing the multiple active light receiving pixels Pb to output light receiving data.
  • when the drive mode is Dual2, the controller 160 reads out the MP setting data data4 corresponding to Dual2 from the data set in the storage unit 170, and instructs the drive unit 121 to activate the multiple light receiving pixels Pb (light receiving elements) corresponding to the read MP setting data data4.
  • the drive unit 121 activates the multiple light receiving pixels Pb that correspond to the MP setting data data4 among the multiple light receiving pixels Pb, thereby causing the multiple active light receiving pixels Pb to output light receiving data.
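  • The data set in the storage unit 170 and the resulting activation can be summarized as below (a sketch; the dictionary values stand in for the actual pixel address data, and the function names are placeholders):

        # Sketch of the drive-mode -> MP setting data lookup and the
        # activation of only the corresponding light-receiving pixels Pb.
        MP_DATA_SET = {          # data set held in the storage unit 170
            "Dual1":   "data1",  # each value stands in for address data of
            "Single1": "data2",  # the light-receiving pixels Pb to activate
            "Single2": "data3",
            "Dual2":   "data4",
        }

        def activate_pixels(address_data: str) -> None:
            """Placeholder: drive unit 121 activates only the pixels Pb named
            by `address_data`, so only they output light-receiving data."""
            ...

        def apply_drive_mode(mode: str) -> None:
            mp_setting = MP_DATA_SET[mode]  # controller 160 reads the data set
            activate_pixels(mp_setting)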
  • suppose that the region (active pixel region MP) containing the multiple active light receiving pixels Pb (light receiving elements) is fixed regardless of the distance to the subject, for example, as shown in FIG. 14.
  • in that case, the position of the region (illumination spot Rb) onto which the incident light Lb falls is displaced depending on the distance to the subject due to the effects of parallax, and the illumination spot Rb may fall outside the active pixel region MP. If the illumination spot Rb falls outside the active pixel region MP, the subject cannot be detected.
  • when the controller 160 receives drive mode setting information from the information processing device 200 (processor 220), it reads out MP setting data corresponding to the received drive mode setting information from the data set in the storage unit 170.
  • the controller 160 instructs the drive unit 121 to activate a plurality of light receiving pixels Pb (light receiving elements) that correspond to the read MP setting data, among the plurality of light receiving pixels Pb (light receiving elements) included in the light receiving element array 122. This allows the controller 160 to output light receiving data from the plurality of active light receiving pixels Pb (light receiving elements).
  • the driving unit 121 sets an active pixel region MP according to the driving mode in accordance with control from the controller 160. This allows selective activation of multiple light receiving pixels Pb included in the area where the incident light Lb is incident in the light receiving element array 122. As a result, power consumption in the light receiving element array 122 can be kept low compared to when all light receiving pixels Pb included in the light receiving element array 122 are activated. Furthermore, regardless of the distance to the subject, at least a portion of the irradiation spot Rb can be made to overlap at least a portion of the active pixel region MP. As a result, the subject can be reliably detected regardless of the distance to the subject while keeping power consumption in the light receiving element array 122 low. Therefore, the distance to the subject can be derived with high accuracy.
  • when the controller 160 receives drive mode setting information from the information processing device 200 (processor 220), it reads MP setting data corresponding to the received drive mode setting information from the data set in the storage unit 170, and instructs the drive unit 121 to activate multiple light receiving pixels Pb (light receiving elements) corresponding to the read MP setting data. This allows the controller 160 to determine the multiple light receiving pixels Pb (light receiving elements) to be activated according to the drive mode obtained from the information processing device 200 (processor 220).
  • if the drive mode acquired from the information processing device 200 (processor 220) is a mode (e.g., Dual1, Single1) that assumes that the subject is located at a relatively close distance, the controller 160 sets the active pixel area MP, for example, at the position shown in FIG. 16(A). If the drive mode acquired from the information processing device 200 (processor 220) is a mode (e.g., Dual2) that assumes that the subject is located at a relatively medium distance, the controller 160 sets the active pixel area MP, for example, at the position shown in FIG. 16(B). If the drive mode acquired from the information processing device 200 (processor 220) is a mode that assumes that the subject is located at a relatively far distance, the controller 160 sets the active pixel area MP, for example, at the position shown in FIG. 16(C).
  • the power consumption in the light receiving element array 122 can be kept low and the subject can be reliably detected regardless of the distance to the subject. Furthermore, even if the drive mode is dynamically changed under control of the information processing device 200 (processor 220), the power consumption in the light receiving element array 122 can be kept low and the subject can be reliably detected regardless of the drive mode. Therefore, the distance to the subject can be derived with high accuracy.
  • the histogram generating unit 140 may calculate distance data based on the generated histogram.
  • the controller 160 may dynamically set the drive mode based on the distance data calculated by the histogram generating unit 140.
  • FIG. 17 shows an example of various drive modes. Three drive modes are shown in FIG. 17: basic mode, close distance mode, and medium distance mode.
  • the controller 160 dynamically sets, for example, one of the three drive modes shown in FIG. 17 based on the distance data calculated by the histogram generating unit 140.
  • FIG. 18 shows an example of a procedure for dynamically setting one of the three drive modes shown in FIG. 17.
  • the controller 160 controls the drive units 111 and 121 in the basic mode shown in FIG. 17 (step S101).
  • next, the controller 160 acquires the distance data D calculated by the histogram generating unit 140 (step S102). If the distance data D is shorter than 3 m and also shorter than 1 m (step S103: Y, step S104: Y), the controller 160 controls the drive units 111 and 121 in the close distance mode shown in FIG. 17 (step S105). If the distance data D is shorter than 3 m but 1 m or longer (step S103: Y, step S104: N), the controller 160 controls the drive units 111 and 121 in the medium distance mode shown in FIG. 17 (step S106).
  • the drive mode is dynamically set based on the distance data calculated by the histogram generating unit 140. This makes it possible to derive the distance to the subject with high accuracy.
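  • The switching procedure of FIG. 18 reduces to a small decision rule (a sketch; the 1 m and 3 m thresholds are from the text, and remaining in basic mode when D is 3 m or longer is an assumption):

        # Sketch of the mode selection in FIG. 18.
        def select_mode(distance_m: float) -> str:
            if distance_m < 3.0:          # step S103: Y
                if distance_m < 1.0:      # step S104: Y
                    return "close"        # step S105: close distance mode
                return "medium"           # step S106: medium distance mode
            return "basic"                # step S103: N (assumed: stay in basic mode)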
  • the distance measuring device 100 may further include a reliability calculation unit 180 that calculates a reliability C (score value) as shown in Fig. 19.
  • the reliability C is useful when the subject is at a close distance.
  • the distance corresponding to the peak of the histogram (first data) obtained by weak light emission when the subject is close is just slightly larger than the boundary between close and not close (for example, 1 m).
  • the peak of the histogram (second data) obtained by strong light emission when the subject is close is just slightly smaller than the boundary between close and not close (for example, 1 m) due to the pile-up phenomenon.
  • the distance measuring device 100 performs driving that switches between strong and weak light emission within one frame period, and acquires a histogram (first data, second data) of each light receiving pixel Pb (light receiving element) based on the light received at that time (strong and weak light emission distance measuring, step S201).
  • the communication unit 150 outputs at least one of the first data and the second data, data on the light emission intensity of the light-emitting unit 110, and the reliability C as transmission data in a predetermined format. If necessary, the communication unit 150 may output distance data corresponding to at least one of the first data and the second data, data on the light emission intensity of the light-emitting unit 110, and the reliability C as transmission data in a predetermined format.
  • transmission data including the reliability C (score value) is output. This makes it possible to derive the distance to the subject from at least one of the first data and the second data. As a result, it is possible to avoid a situation in which the distance to the subject cannot be derived because of where the peak positions in the first data and the second data fall relative to the boundary.
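  • For illustration only, a reliability score of this kind might be shaped as follows (the patent's actual computation in FIG. 21 is not reproduced; this assumed score simply grows as a histogram peak rises above the detection threshold):

        # Assumed illustration, not the procedure of FIG. 21: a score in
        # [0, 1] that is 0 at or below the detection threshold and approaches
        # 1 as the peak count dominates the threshold.
        def reliability(peak_count: int, threshold: int) -> float:
            if peak_count <= threshold:
                return 0.0
            return 1.0 - threshold / peak_count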
  • the technology according to the present disclosure can be applied to various products.
  • the technology according to the present disclosure may be realized as a device mounted on any type of moving object, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
  • the vehicle control system 7000 includes a plurality of electronic control units connected via a communication network 7010.
  • the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, an outside vehicle information detection unit 7400, an inside vehicle information detection unit 7500, and an integrated control unit 7600.
  • the communication network 7010 connecting these multiple control units may be, for example, an in-vehicle communication network conforming to any standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
  • CAN Controller Area Network
  • LIN Local Interconnect Network
  • LAN Local Area Network
  • FlexRay registered trademark
  • Each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer or parameters used in various calculations, and a drive circuit that drives various devices to be controlled.
  • Each control unit includes a network I/F for communicating with other control units via a communication network 7010, and a communication I/F for communicating with devices or sensors inside and outside the vehicle by wired or wireless communication.
  • the functional configuration of the integrated control unit 7600 includes a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio/image output unit 7670, an in-vehicle network I/F 7680, and a storage unit 7690.
  • Other control units also include a microcomputer, a communication I/F, a storage unit, and the like.
  • the drive system control unit 7100 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • the drive system control unit 7100 functions as a control device for a drive force generating device for generating a drive force for the vehicle, such as an internal combustion engine or a drive motor, a drive force transmission mechanism for transmitting the drive force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, and a braking device for generating a braking force for the vehicle.
  • the drive system control unit 7100 may also function as a control device such as an ABS (Antilock Brake System) or ESC (Electronic Stability Control).
  • the drive system control unit 7100 is connected to a vehicle state detection unit 7110.
  • the vehicle state detection unit 7110 includes at least one of the following: a gyro sensor that detects the angular velocity of the axial rotational motion of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, or a sensor for detecting the amount of operation of the accelerator pedal, the amount of operation of the brake pedal, the steering angle of the steering wheel, the engine speed, or the rotation speed of the wheels.
  • the drive system control unit 7100 performs arithmetic processing using the signal input from the vehicle state detection unit 7110, and controls the internal combustion engine, the drive motor, the electric power steering device, the brake device, etc.
  • the body system control unit 7200 controls the operation of various devices installed in the vehicle body according to various programs.
  • the body system control unit 7200 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, tail lamps, brake lamps, turn signals, and fog lamps.
  • for example, radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 7200.
  • the body system control unit 7200 accepts the input of these radio waves or signals and controls the vehicle's door lock device, power window device, lamps, etc.
  • the battery control unit 7300 controls the secondary battery 7310, which is the power supply source for the drive motor, according to various programs. For example, information such as the battery temperature, battery output voltage, or remaining capacity of the battery is input to the battery control unit 7300 from a battery device equipped with the secondary battery 7310. The battery control unit 7300 performs calculations using these signals, and controls the temperature regulation of the secondary battery 7310 or a cooling device or the like equipped in the battery device.
  • the outside vehicle information detection unit 7400 detects information outside the vehicle equipped with the vehicle control system 7000.
  • at least one of an imaging unit 7410 and an outside-vehicle information detection unit 7420 is connected to the outside-vehicle information detection unit 7400.
  • the imaging unit 7410 includes at least one of a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
  • the outside vehicle information detection unit 7420 includes at least one of an environmental sensor for detecting the current weather or climate, or a surrounding information detection sensor for detecting other vehicles, obstacles, pedestrians, etc., around the vehicle equipped with the vehicle control system 7000.
  • the environmental sensor may be, for example, at least one of a raindrop sensor that detects rain, a fog sensor that detects fog, a sunshine sensor that detects the level of sunlight, and a snow sensor that detects snowfall.
  • the surrounding information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device.
  • the imaging unit 7410 and the outside vehicle information detection unit 7420 may each be provided as an independent sensor or device, or may be provided as a device in which multiple sensors or devices are integrated.
  • the imaging units 7910, 7912, 7914, 7916, and 7918 are provided at, for example, at least one of the following positions on the vehicle 7900: the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield inside the vehicle cabin.
  • the imaging unit 7910 provided on the front nose and the imaging unit 7918 provided on the upper part of the windshield inside the vehicle cabin mainly acquire images of the front of the vehicle 7900.
  • the imaging units 7912 and 7914 provided on the side mirrors mainly acquire images of the sides of the vehicle 7900.
  • the imaging unit 7916 provided on the rear bumper or back door mainly acquires images of the rear of the vehicle 7900.
  • the imaging unit 7918 provided on the upper part of the windshield inside the vehicle cabin is mainly used to detect leading vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, etc.
  • FIG. 24 shows an example of the imaging ranges of the imaging units 7910, 7912, 7914, and 7916.
  • Imaging range a indicates the imaging range of the imaging unit 7910 provided on the front nose
  • imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided on the side mirrors, respectively
  • imaging range d indicates the imaging range of the imaging unit 7916 provided on the rear bumper or back door.
  • an overhead image of the vehicle 7900 viewed from above is obtained by superimposing the image data captured by the imaging units 7910, 7912, 7914, and 7916.
  • External information detection units 7920, 7922, 7924, 7926, 7928, and 7930 provided on the front, rear, sides, corners, and upper part of the windshield inside the vehicle 7900 may be, for example, ultrasonic sensors or radar devices.
  • External information detection units 7920, 7926, and 7930 provided on the front nose, the rear bumper or back door, and the upper part of the windshield inside the vehicle cabin of the vehicle 7900 may be, for example, LIDAR devices. These external information detection units 7920 to 7930 are mainly used to detect preceding vehicles, pedestrians, obstacles, etc.
  • the outside-vehicle information detection unit 7400 causes the imaging unit 7410 to capture an image outside the vehicle and receives the captured image data.
  • the outside-vehicle information detection unit 7400 also receives detection information from the connected outside-vehicle information detection unit 7420. If the outside-vehicle information detection unit 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the outside-vehicle information detection unit 7400 transmits ultrasonic waves or electromagnetic waves and receives information on the received reflected waves.
  • the outside-vehicle information detection unit 7400 may perform object detection processing or distance detection processing for people, cars, obstacles, signs, or characters on the road surface based on the received information.
  • the outside-vehicle information detection unit 7400 may perform environmental recognition processing for recognizing rainfall, fog, road surface conditions, etc. based on the received information.
  • the outside-vehicle information detection unit 7400 may calculate the distance to an object outside the vehicle based on the received information.
  • the outside-vehicle information detection unit 7400 may also perform image recognition processing or distance detection processing to recognize people, cars, obstacles, signs, or characters on the road surface based on the received image data.
  • the outside-vehicle information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data, and may also generate an overhead image or a panoramic image by synthesizing image data captured by different imaging units 7410.
  • the outside-vehicle information detection unit 7400 may also perform viewpoint conversion processing using image data captured by different imaging units 7410.
  • the in-vehicle information detection unit 7500 detects information inside the vehicle.
  • a driver state detection unit 7510 that detects the state of the driver is connected to the in-vehicle information detection unit 7500.
  • the driver state detection unit 7510 may include a camera that captures an image of the driver, a biosensor that detects the driver's biometric information, or a microphone that collects sound inside the vehicle.
  • the biosensor is provided, for example, on the seat or steering wheel, and detects the biometric information of a passenger sitting in the seat or a driver gripping the steering wheel.
  • the in-vehicle information detection unit 7500 may calculate the degree of fatigue or concentration of the driver based on the detection information input from the driver state detection unit 7510, or may determine whether the driver is dozing off.
  • the in-vehicle information detection unit 7500 may perform processing such as noise canceling on the collected sound signal.
  • the integrated control unit 7600 controls the overall operation of the vehicle control system 7000 according to various programs.
  • the input unit 7800 is connected to the integrated control unit 7600.
  • the input unit 7800 is realized by a device that can be operated by the passenger, such as a touch panel, a button, a microphone, a switch, or a lever. Data obtained by performing voice recognition on speech input via the microphone may be input to the integrated control unit 7600.
  • the input unit 7800 may be, for example, a remote control device using infrared or other radio waves, or an externally connected device such as a mobile phone or a PDA (Personal Digital Assistant) that supports the operation of the vehicle control system 7000.
  • the input unit 7800 may be, for example, a camera, in which case the passenger can input information by gestures. Alternatively, data obtained by detecting the movement of a wearable device worn by the passenger may be input. Furthermore, the input unit 7800 may include, for example, an input control circuit that generates an input signal based on information input by the passenger using the above-mentioned input unit 7800 and outputs the input signal to the integrated control unit 7600. Passengers and others can operate the input unit 7800 to input various data and instruct processing operations to the vehicle control system 7000.
  • the memory unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer, and a RAM (Random Access Memory) that stores various parameters, calculation results, sensor values, etc.
  • the memory unit 7690 may also be realized by a magnetic memory device such as a HDD (Hard Disc Drive), a semiconductor memory device, an optical memory device, or a magneto-optical memory device, etc.
  • the general-purpose communication I/F 7620 is a general-purpose communication I/F that mediates communication between various devices present in the external environment 7750.
  • the general-purpose communication I/F 7620 may implement cellular communication protocols such as GSM (registered trademark) (Global System of Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution) or LTE-A (LTE-Advanced), or other wireless communication protocols such as wireless LAN (also called Wi-Fi (registered trademark)) and Bluetooth (registered trademark).
  • the general-purpose communication I/F 7620 may connect to devices (e.g., application servers or control servers) present on an external network (e.g., the Internet, a cloud network, or an operator-specific network) via, for example, a base station or an access point.
  • the general-purpose communication I/F 7620 may connect to a terminal located near the vehicle (e.g., a driver's, pedestrian's, or store's terminal, or an MTC (Machine Type Communication) terminal) using, for example, P2P (Peer To Peer) technology.
  • the dedicated communication I/F 7630 is a communication I/F that supports a communication protocol developed for use in a vehicle.
  • the dedicated communication I/F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which is a combination of the lower-layer IEEE 802.11p and the higher-layer IEEE 1609, DSRC (Dedicated Short Range Communications), or a cellular communication protocol.
  • the dedicated communication I/F 7630 typically performs V2X communication, which is a concept that includes one or more of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication.
  • the positioning unit 7640 performs positioning by receiving, for example, GNSS signals from GNSS (Global Navigation Satellite System) satellites (for example, GPS signals from GPS (Global Positioning System) satellites), and generates position information including the latitude, longitude, and altitude of the vehicle.
  • the positioning unit 7640 may determine the current position by exchanging signals with a wireless access point, or may obtain position information from a terminal such as a mobile phone, PHS, or smartphone that has a positioning function.
  • the beacon receiver 7650 receives, for example, radio waves or electromagnetic waves transmitted from radio stations installed on the road, and acquires information such as the current location, congestion, road closures, and travel time.
  • the functions of the beacon receiver 7650 may be included in the dedicated communication I/F 7630 described above.
  • the in-vehicle device I/F 7660 is a communication interface that mediates the connection between the microcomputer 7610 and various in-vehicle devices 7760 present in the vehicle.
  • the in-vehicle device I/F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB).
  • the in-vehicle device I/F 7660 may also establish a wired connection such as USB (Universal Serial Bus), HDMI (High-Definition Multimedia Interface), or MHL (Mobile High-definition Link) via a connection terminal (and a cable, if necessary) not shown.
  • the in-vehicle device 7760 may include, for example, at least one of a mobile device or wearable device owned by a passenger, or an information device carried into or attached to the vehicle.
  • the in-vehicle device 7760 may also include a navigation device that searches for a route to an arbitrary destination.
  • the in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
  • the in-vehicle network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010.
  • the in-vehicle network I/F 7680 transmits and receives signals in accordance with a specific protocol supported by the communication network 7010.
  • the microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 according to various programs based on information acquired through at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680.
  • the microcomputer 7610 may calculate the control target value of the driving force generating device, the steering mechanism, or the braking device based on the acquired information inside and outside the vehicle, and output a control command to the drive system control unit 7100.
  • the microcomputer 7610 may perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or impact mitigation, following driving based on the distance between vehicles, vehicle speed maintenance driving, vehicle collision warning, vehicle lane departure warning, etc.
  • the microcomputer 7610 may control the driving force generating device, steering mechanism, braking device, etc. based on the acquired information about the surroundings of the vehicle, thereby performing cooperative control for the purpose of automatic driving, which allows the vehicle to travel autonomously without relying on the driver's operation.
  • the microcomputer 7610 may generate three-dimensional distance information between the vehicle and objects such as surrounding structures and people based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle equipment I/F 7660, and the in-vehicle network I/F 7680, and may create local map information including information about the surroundings of the vehicle's current position.
  • the microcomputer 7610 may also predict dangers such as vehicle collisions, the approach of pedestrians, or entry into closed roads based on the acquired information, and generate warning signals.
  • the warning signals may be, for example, signals for generating warning sounds or turning on warning lights.
  • the audio/image output unit 7670 transmits at least one of audio and image output signals to an output device capable of visually or audibly notifying the passengers of the vehicle or the outside of the vehicle of information.
  • an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are illustrated as output devices.
  • the display unit 7720 may include, for example, at least one of an on-board display and a head-up display.
  • the display unit 7720 may have an AR (Augmented Reality) display function.
  • the output device may be other devices such as headphones, a wearable device such as a glasses-type display worn by the passenger, a projector, or a lamp, in addition to these devices.
  • When the output device is a display device, the display device visually displays the results obtained by various processes performed by the microcomputer 7610 or information received from other control units in various formats such as text, images, tables, and graphs.
  • When the output device is an audio output device, the audio output device converts an audio signal consisting of reproduced audio data or acoustic data into an analog signal and outputs it audibly.
  • At least two control units connected via the communication network 7010 may be integrated into one control unit.
  • each control unit may be composed of multiple control units.
  • the vehicle control system 7000 may include another control unit not shown.
  • some or all of the functions performed by any control unit may be provided by another control unit.
  • a predetermined calculation process may be performed by any control unit.
  • a sensor or device connected to any control unit may be connected to another control unit, and multiple control units may transmit and receive detection information to each other via the communication network 7010.
  • computer programs for implementing the functions of the distance measuring device 100 and the information processing device 200 described above can be implemented in any of the control units, etc.
  • a computer-readable recording medium on which such a computer program is stored can also be provided.
  • the recording medium can be, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, etc.
  • the above computer program can be distributed, for example, via a network, without using a recording medium.
  • the above-mentioned distance measuring device 100 and information processing device 200 can be used, for example, as the light-source steering unit of a LIDAR serving as an environmental sensor.
  • image recognition in the imaging unit can be performed by an optical computing unit using the above-mentioned distance measuring device 100 and information processing device 200.
  • the components of the distance measuring device 100 and the information processing device 200 described above may be realized in a module (e.g., an integrated circuit module configured on a single die) for the integrated control unit 7600 shown in FIG. 23.
  • the distance measuring device 100 and the information processing device 200 described above may be realized by multiple control units of the vehicle control system 7000 shown in FIG. 23.
  • the present disclosure can have the following configuration.
  • a light emitting unit that emits irradiation light; and a distance measuring unit that calculates data on a distance to a subject based on incident light obtained by irradiating the subject with the irradiation light, wherein the light emitting unit includes a plurality of light emitting elements arranged in an array and a first driving unit that sequentially causes the plurality of light emitting elements to emit light for each predetermined unit group within one frame period, and the first driving unit causes the plurality of light emitting elements in the unit group to emit light at a first emission intensity and then at a second emission intensity different from the first emission intensity during the one frame period.
  • the distance measuring device described in (1) wherein the distance measuring unit calculates first data regarding the distance to the subject obtained when the multiple light-emitting elements are made to emit light at the first emission intensity, and second data regarding the distance to the subject obtained when the multiple light-emitting elements are made to emit light at the second emission intensity.
  • The distance measuring device according to any one of (1) to (3), further including an acquisition unit that acquires a control signal for setting a drive mode, wherein the first driving unit sets the first emission intensity and the second emission intensity according to the drive mode acquired by the acquisition unit.
  • the distance measuring device described in (4) wherein when the drive mode acquired by the acquisition unit is a mode assuming that the subject is located at a relatively long distance, the first drive unit causes the multiple light-emitting elements in the unit group to emit light at the first emission intensity and then at the second emission intensity during the one frame period.
  • The distance measuring device according to (4) or (5), further including a light receiving unit that receives the incident light, wherein the light receiving unit has a plurality of light receiving elements arranged in an array and a second drive unit that activates at least a portion of the plurality of light receiving elements to output light receiving data from the activated light receiving elements, and the second driving unit determines which of the plurality of light receiving elements is to be activated in accordance with the drive mode acquired by the acquisition unit.
  • a light receiving unit that receives the incident light is further provided, the light receiving unit has a plurality of light receiving elements arranged in an array, and a second drive unit that activates at least a portion of the plurality of light receiving elements to output light receiving data from the activated light receiving elements,
  • the light receiving element is a single photon avalanche diode (SPAD)
  • the distance measuring unit generates a histogram for each pixel that includes one or more of the light receiving elements, regarding the time from when the light emitting element is driven to when light is detected as being incident on the one or more light receiving elements, and uses the generated histogram or distance information calculated based on the generated histogram as data regarding the distance to the subject.
  • the distance measuring unit calculates a reliability of the histogram based on the generated histogram
  • the distance measuring device described in (7) further includes an output unit that outputs, as output data in a predetermined format, at least one of the first data and the second data calculated by the distance measuring unit, data on the light emission intensity of the light emitting unit, and the reliability.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A ranging device according to one embodiment of the present disclosure is provided with a light-emitting unit for emitting irradiation light, and a ranging unit for calculating data pertaining to the distance to a subject on the basis of incident light obtained by irradiating the subject with the irradiation light. The light-emitting unit has a plurality of light-emitting elements arranged in an array, and a drive unit for causing the plurality of light-emitting elements to sequentially emit light in prescribed unit groups within one frame period. Within one frame period, the drive unit causes a plurality of light-emitting elements in a unit group to emit light at a first light emission intensity and then emit light at a second light emission intensity different from the first light emission intensity.

Description

Distance measuring device
This disclosure relates to a distance measuring device.
In recent years, optical detection devices using SPADs (Single Photon Avalanche Diodes) have been attracting attention in fields such as distance measurement sensors (see, for example, Patent Document 1).
JP 2021-120630 A
In the field of distance measuring sensors, there is a demand for further improvements in distance measuring accuracy. Therefore, it is desirable to provide a distance measuring device that can achieve further improvements in distance measuring accuracy.
A distance measuring device according to one aspect of the present disclosure includes a light emitting unit that emits irradiation light, and a distance measuring unit that calculates data regarding the distance to a subject based on incident light obtained by irradiating the subject with the irradiation light. The light emitting unit has a plurality of light emitting elements arranged in an array, and a driving unit that sequentially causes the plurality of light emitting elements to emit light for each predetermined unit group within one frame period. The driving unit causes the plurality of light emitting elements in the unit group to emit light at a first emission intensity and then at a second emission intensity different from the first emission intensity within one frame period.
In the distance measuring device according to one aspect of the present disclosure, during one frame period, the multiple light-emitting elements in a unit group emit light at a first emission intensity and then at a second emission intensity different from the first emission intensity. As a result, even if the subject is at a close distance, data can be obtained that has little measurement error due to the pile-up phenomenon and is almost free of the effects of dead time due to reflection on the cover glass. Furthermore, even if the subject is at a long distance, data can be obtained that allows the distance to the subject to be calculated.
FIG. 1 is a diagram illustrating an example of functional blocks of a distance measuring device and an information processing device according to an embodiment of the present disclosure.
FIG. 2 is a diagram illustrating an example of output data output from the communication unit in FIG. 1.
FIG. 3 is a diagram showing an example of a planar configuration of the light-emitting element array of FIG. 1.
FIG. 4 is a diagram showing an example of the planar configuration of the DOE of FIG. 1 and an example of the distribution of light emission spots generated by the DOE of FIG. 1.
FIG. 5 is a diagram illustrating an example of functional blocks of the light receiving unit in FIG. 1.
FIG. 6 is a diagram illustrating an example of a circuit configuration of the light-receiving pixel in FIG. 5.
FIG. 7 is a diagram showing an example of light emission by light-emitting pixels and light reception by light-receiving pixels within one frame period.
FIG. 8 is a diagram illustrating an example of a histogram generated by the histogram generating unit of FIG. 1.
FIG. 9 is a diagram for explaining the Pile-Up phenomenon.
FIG. 10 is a diagram for explaining the influence of reflection on the cover glass.
FIG. 11 is a diagram showing a modified example of the functional blocks of the distance measuring device and the information processing device in FIG. 1.
FIG. 12 is a diagram illustrating an example of data stored in the storage unit of FIG. 11.
FIG. 13 is a diagram showing an example of the various drive modes shown in FIG. 11.
FIG. 14 is a diagram showing an example of how light is received in the light receiving element array of FIG. 11.
FIG. 15 is a diagram showing an example of how light is received in the light receiving element array of FIG. 11.
FIG. 16 is a diagram showing an example of how light is received in the light receiving element array of FIG. 11.
FIG. 17 is a diagram showing an example of various drive modes.
FIG. 18 is a diagram showing an example of a procedure for switching between the various drive modes shown in FIG. 17.
FIG. 19 is a diagram showing a modified example of the functional blocks of the distance measuring device and the information processing device in FIG. 1.
FIG. 20 is a diagram illustrating an example of a histogram generated by the histogram generating unit of FIG. 19.
FIG. 21 is a diagram illustrating an example of a procedure for calculating the reliability.
FIG. 22 is a diagram illustrating an example of output data output from the communication unit in FIG. 19.
FIG. 23 is a block diagram showing an example of a schematic configuration of a vehicle control system.
FIG. 24 is an explanatory diagram showing an example of the installation positions of the outside-vehicle information detection unit and the imaging unit.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. The description will be made in the following order.

1. Embodiment: example of switching the light emission intensity within one frame period (FIGS. 1 to 10)
2. Modifications
Modification A: example of setting the switching of the light emission intensity for each drive mode and setting the light receiving area for each drive mode (FIGS. 11 to 16)
Modification B: example of switching the drive mode using distance data (FIGS. 17 and 18)
Modification C: example of calculating the reliability (FIGS. 19 to 22)
3. Application examples (FIGS. 23 and 24)
<1. Embodiment>
[Configuration]
FIG. 1 illustrates an example of functional blocks of a distance measuring device 100 according to an embodiment of the present disclosure and an information processing device 200 that processes data output from the distance measuring device 100. The distance measuring device 100 and the information processing device 200 are connected by an FPC 300.
The distance measuring device 100 and the information processing device 200 are electrically connected by a data bus included in the FPC 300. The data bus is one signal transmission path connecting the distance measuring device 100 and the information processing device 200. Transmission data sent from the distance measuring device 100 is transmitted to the information processing device 200 via the data bus. The distance measuring device 100 and the information processing device 200 may also be electrically connected by a control bus included in the FPC 300. The control bus is another signal transmission path connecting the distance measuring device 100 and the information processing device 200, and is a transmission path different from the data bus. Control data sent from the information processing device 200 is transmitted to the distance measuring device 100 via the control bus.
(Information processing device 200)
The information processing device 200 includes a communication unit 210, a processor 220, and a storage unit 230. The communication unit 210 is a communication interface configured to be able to communicate with the distance measuring device 100 via the FPC 300. The processor 220 includes one or more processors formed of arithmetic circuits such as an MPU (Micro Processing Unit), various processing circuits, and the like. The processor 220 performs various processes, such as a process related to controlling the recording of data in the storage unit 230 and a process of executing arbitrary application software. The processor 220 may control the functions of the distance measuring device 100, for example, by transmitting control information to the distance measuring device 100. The processor 220 may also control the drive mode of the distance measuring device 100, for example, by transmitting drive mode setting information to the distance measuring device 100.
(Packet structure)
Next, an example of the structure of a packet used for transmitting transmission data from the distance measuring device 100 to the information processing device 200 will be described. The transmission data generated in the distance measuring device 100 is divided, for example, into partial data for each row, and the partial data for each row is transmitted using one or more packets.
FIG. 2 shows an example of the transmission data transmitted from the distance measuring device 100 to the information processing device 200, as used when transmitting data according to the MIPI CSI-2 standard or the MIPI CSI-3 standard.
As shown in FIG. 2, a packet used to transmit the transmission data contains a packet header PH, payload data, and a packet footer PF, arranged in this order. The payload data (hereinafter also simply referred to as the "payload") contains the pixel data of the row-by-row partial data.
The packet header PH is, for example, the packet header of the PayloadData of a LongPacket. A LongPacket refers to a packet that is placed between the packet header PH and the packet footer PF. The PayloadData of a LongPacket refers to the main data transmitted between devices.
The transmission data is composed of a data frame, for example, as shown in FIG. 2. A data frame usually has a header region, a packet region, and a footer region. In the data frame, the header region contains EmbeddedData. EmbeddedData refers to additional information that can be embedded in the header or footer of a data frame. Here, the EmbeddedData contains a frame number, a channel number, and emission intensity information.
Also, as shown in FIG. 2, in the data frame, the packet region contains the PayloadData of a LongPacket for each line, with a packet header PH and a packet footer PF placed so as to sandwich the PayloadData of each LongPacket. The packet region also contains, for example, the histogram data described later, or distance data calculated based on the histogram. The histogram data, or the distance data calculated based on the histogram, corresponds to a specific example of the "data regarding the distance to the subject" of the present disclosure. The PayloadData of the LongPacket of each line contains one line's worth of pixel data of the histogram data described later, or of the distance data calculated based on the histogram.
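To make the frame layout described above easier to follow, the following is a minimal, non-authoritative sketch of the data-frame contents in Python. Only the items named in the text (EmbeddedData carrying a frame number, a channel number, and emission intensity information, and per-line PayloadData delimited by a packet header PH and a packet footer PF) come from the description; the class and field names are illustrative assumptions, and the bit-level layout defined by the MIPI CSI-2/CSI-3 standards is not reproduced.

```python
# Minimal sketch of the transmission-data frame described above.
# Field names are illustrative; the standard-defined bit layout of
# MIPI CSI-2/CSI-3 packets is intentionally not modeled.
from dataclasses import dataclass, field
from typing import List

@dataclass
class EmbeddedData:              # header region of the data frame
    frame_number: int
    channel_number: int
    emission_intensity: int      # e.g., a weak/strong emission setting

@dataclass
class LongPacket:                # one line of the packet region
    packet_header: bytes         # PH
    payload: bytes               # one line of histogram or distance data
    packet_footer: bytes         # PF

@dataclass
class DataFrame:
    embedded: EmbeddedData
    lines: List[LongPacket] = field(default_factory=list)
```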
(Distance measuring device 100)
The distance measuring device 100 includes a light emitting unit 110, a light receiving unit 120, a cover glass 130, a histogram generating unit 140, a communication unit 150, and a controller 160. In the distance measuring device 100, the light receiving unit 120, the histogram generating unit 140, the communication unit 150, and the controller 160 are provided, for example, in one sensor chip.
The light emitting unit 110 has a drive circuit 111, a light-emitting element array 112, and a DOE 113.
The light-emitting element array 112 has a configuration in which a plurality of light-emitting pixels Pa are arranged in an array, as shown in FIG. 3, for example. Each light-emitting pixel Pa includes a light-emitting element. Driven by the drive circuit 111, each light-emitting element outputs light of a predetermined wavelength (also called irradiation light) at a predetermined pulse repetition interval (also called the PRI period). For each light-emitting element, for example, a vertical cavity surface emitting laser (VCSEL) can be used. However, various light sources capable of emitting light of a predetermined wavelength may be used for each light-emitting element.
In the light-emitting element array 112, for example, as shown in FIG. 3, the plurality of light-emitting pixels Pa are divided into N channels Ch1 to ChN. Each channel Ch corresponds to a predetermined unit group that is driven simultaneously by the drive circuit 111.
The drive circuit 111 drives each light-emitting pixel Pa of the light-emitting element array 112. The drive circuit 111 may be configured to be capable of driving each light-emitting pixel Pa independently, or may be configured to drive the pixels in predetermined unit groups. When independent driving of each light-emitting pixel Pa is enabled, the drive circuit 111 may include, for example, a plurality of drive circuits provided in one-to-one correspondence with the plurality of light-emitting pixels Pa. When the plurality of light-emitting pixels Pa can be driven in predetermined unit groups, the drive circuit 111 may include, for example, a plurality of drive circuits provided one for each channel Ch. The drive circuit 111 is configured to sequentially drive the plurality of light-emitting pixels Pa for each channel Ch. In other words, the drive circuit 111 is configured to sequentially cause the plurality of light-emitting pixels Pa (light-emitting elements) to emit light for each channel Ch.
The DOE 113 is a diffractive optical element. Using a diffraction grating that exploits the diffraction of light, the DOE 113 splits the light emitted from each light-emitting pixel Pa into a plurality of beams. For example, as shown in FIG. 4, the DOE 113 splits each light beam incident from the light-emitting element array 112 into three beams. This allows the DOE 113 to generate three times as many light emission spots Sp as there are light-emitting pixels Pa in the light-emitting element array 112. Note that the number of beams into which the DOE 113 splits each light beam is not limited to three, and may be two, or four or more.
The light generated by the DOE 113 is emitted to the outside through the cover glass 130. The light emitted to the outside through the cover glass 130 becomes the irradiation light La and reaches the subject. In other words, the light emitting unit 110 emits the irradiation light La toward the subject. The irradiation light La is reflected by the subject, and the reflected light (incident light Lb) generated by the reflection at the subject is incident on the light receiving unit 120 through the cover glass 130. The DOE 113 may be omitted if unnecessary.
The light receiving unit 120 receives the reflected light (incident light Lb) obtained by irradiating the subject with the irradiation light La. The light receiving unit 120 has a drive circuit 121, a light receiving element array 122, and a TDC (Time-to-Digital Converter) 123.
The light receiving element array 122 has a configuration in which a plurality of light-receiving pixels Pb are arranged in an array, as shown in FIG. 5, for example. Each light-receiving pixel Pb has a light receiving element (SPAD element 21), a readout circuit 22, a latch 26, and an output buffer 27, as shown in FIG. 6, for example.
The SPAD element 21 operates in Geiger mode, and generates an avalanche current when a photon is incident while a negative bias voltage VSPAD (for example, about -10 V to -20 V) equal to or greater than the breakdown voltage is applied between its anode and cathode.
The readout circuit 22 is composed of three transistors 23 to 25. The transistor 23 is, for example, a quench resistor, and may be composed of a P-type MOS (Metal-Oxide-Semiconductor) transistor. The transistors 24 and 25 are, for example, selection transistors for placing the light-receiving pixel Pb in a selected state. The transistor 24 may be composed of, for example, a P-type MOS transistor, and the transistor 25 may be composed of, for example, an N-type MOS transistor.
A bias voltage (also called a quench voltage), set in advance so as to make the transistor 23 act as a quench resistor, is applied to the gate of the transistor 23. The output of the latch 26, described below, is applied, for example, to the gates of the transistors 24 and 25.
The latch 26 is composed of a latch circuit 261, a NAND circuit 262, and a buffer 263. The latch circuit 261 holds information on whether the pixel is selected or not selected. The NAND circuit 262 computes the NAND of the pixel selection/non-selection information held in the latch circuit 261 and the column selection signal YE. The result of this NAND operation is applied to the gates of the selection transistors 24 and 25 in the readout circuit 22, and is also input to the control terminal OE of the output buffer 27 via the buffer 263.
In the above configuration, when the light-receiving pixel Pb is in a non-selected state ('0' is stored in the latch circuit 261), the pixel selection signal PXSEL = 0. At this time, the output of the NAND circuit 262 is PXE = 1, so the transistor 24 turns off and the transistor 25 turns on, whereby the cathode potential VS of the SPAD element 21 becomes 0 V. That is, only the breakdown voltage is applied to the SPAD element 21, and the pixel enters a mode in which photons are not detected.
When the light-receiving pixel Pb is in a selected state ('1' is stored in the latch circuit 261), the pixel selection signal PXSEL = 1. At this time, if the column selection signal YE = 1, the output of the NAND circuit 262 is PXE = 0, the transistor 24 turns on, the transistor 25 turns off, and a voltage equal to or greater than the breakdown voltage is applied to the SPAD element 21. This makes it easier for the charge generated by photoelectric conversion when a photon is incident on the SPAD element 21 to reach the avalanche multiplication region.
When the charge generated by the incidence of a photon reaches the avalanche multiplication region and an avalanche current is generated, the cathode potential VS of the SPAD element 21 is discharged toward 0 V, and the cathode potential VS is then recharged by the current source of the quench transistor 23.
However, even when the light-receiving pixel Pb is placed in the selected state in the latch circuit 261, if the region to which the light-receiving pixel Pb belongs is not selected (column selection signal YE = 0), the breakdown voltage is applied between both electrodes of the SPAD element 21, but the light-receiving pixel Pb itself is in a non-selected state, so photon detection is not performed.
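The selection behavior described in the preceding paragraphs can be summarized as a small truth table: photon detection occurs only when both the latched pixel selection bit and the column selection signal are asserted. The following Python sketch is a behavioral model under that reading; the function name and return structure are illustrative assumptions, not part of the disclosure.

```python
# Behavioral model of the pixel-selection logic described above.
def pixel_state(pxsel: int, ye: int) -> dict:
    """pxsel: latched selection bit; ye: column selection signal YE."""
    pxe = 0 if (pxsel and ye) else 1      # NAND of PXSEL and YE
    detecting = (pxe == 0)                # PMOS 24 on, NMOS 25 off:
                                          # excess bias applied, photons counted
    return {
        "PXE": pxe,
        "transistor24_on": pxe == 0,      # P-type: conducts when gate is low
        "transistor25_on": pxe == 1,      # N-type: conducts when gate is high
        "photon_detection": detecting,
    }

# Only a selected pixel in a selected column detects photons:
assert pixel_state(1, 1)["photon_detection"] is True
assert pixel_state(1, 0)["photon_detection"] is False   # column not selected
assert pixel_state(0, 1)["photon_detection"] is False   # pixel not selected
```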
When the light-receiving pixel Pb detects a photon, a detection signal PXOUT whose rising edge corresponds to the timing of this photon detection is output from the output buffer 27. The output buffer 27 corresponds to the output circuit 126 shown in FIG. 5.
The TDC 123 converts the timing of the rising edge of the detection signal PXOUT into a digital value. For example, the TDC 123 measures the flight time of a photon from when the light emitting unit 110 emits light until the reflected light (incident light Lb) is detected by the light-receiving pixel Pb, and outputs the measured time as a digital output value TDCOUT. For example, the TDC 123 has a TDC circuit for each light-receiving pixel Pb, measures, for each light-receiving pixel Pb, the elapsed time from the output timing of the irradiation light La to the detection timing of the incident light Lb with high resolution (for example, a period of about 100 ps (picoseconds)), and outputs the time obtained thereby as the digital output value TDCOUT.
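As a numerical illustration of this conversion, the sketch below quantizes an elapsed time into a TDCOUT code at the 100 ps resolution mentioned above. The function name and interface are hypothetical.

```python
# Hypothetical sketch of the TDC conversion: elapsed time -> digital code.
TDC_RESOLUTION_S = 100e-12      # assumed 100 ps bin width, per the text

def tdc_out(emission_time_s: float, detection_time_s: float) -> int:
    """Return TDCOUT, the elapsed time quantized to the TDC resolution."""
    elapsed = detection_time_s - emission_time_s
    return int(elapsed / TDC_RESOLUTION_S)

# A photon detected 6.67 ns after emission (a subject roughly 1 m away)
# yields a code of about 66.
print(tdc_out(0.0, 6.67e-9))    # -> 66
```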
The drive circuit 121 controls the light receiving element array 122 and the TDC 123. The drive circuit 121 includes, for example, a timing control circuit 124 and a drive circuit 125, as shown in FIG. 5.
The timing control circuit 124 includes a timing generator that generates various timing signals, and controls the drive circuit 125 and the output circuit 126 based on the various timing signals generated by the timing generator.
The drive circuit 125 includes a shift register, an address decoder, and the like, and drives each light-receiving pixel Pb of the light receiving element array 122. The drive circuit 125 includes at least a circuit that applies the above-described bias voltage (quench voltage) to each selected light-receiving pixel Pb in the light receiving element array 122.
The detection signal PXOUT output from each light-receiving pixel Pb selected by the drive circuit 125 is input to the output circuit 126 through a corresponding output signal line LS. The output circuit 126 outputs the detection signal PXOUT input from each light-receiving pixel Pb selected by the drive circuit 125 to the TDC 123.
(Sampling period)
In the TDC 123, the period at which the flight time of a photon, from when the light emitting unit 110 emits light until the reflected light (incident light Lb) is detected by the light-receiving pixel Pb, is measured is called the sampling period. The sampling period is set to a period shorter than the PRI period of the light emitting unit 110. For example, by making the sampling period shorter, it becomes possible to estimate or calculate the flight time of photons emitted from the light emitting unit 110 and reflected by the subject with a higher time resolution. This means that by making the sampling frequency higher, it becomes possible to estimate or calculate the distance to the subject with a higher distance measurement resolution.
For example, let t be the flight time from when the light emitting unit 110 emits the irradiation light La, through the reflection of the irradiation light La by the subject, until the reflected light (incident light Lb) is incident on the light receiving unit 120. Since the speed of light C is constant (C ≈ 300,000,000 m/s), the distance L to the subject can be estimated or calculated by the following formula (1).
L = C × t / 2    (1)
For example, if the sampling frequency is 1 GHz, the sampling period is 1 ns (nanosecond). In that case, one sampling period corresponds to 15 cm (centimeters). This indicates that the distance measurement resolution is 15 cm when the sampling frequency is 1 GHz. Furthermore, if the sampling frequency is doubled to 2 GHz, the sampling period becomes 0.5 ns, so one sampling period corresponds to 7.5 cm. This indicates that doubling the sampling frequency halves the distance measurement resolution. In this way, by raising the sampling frequency and shortening the sampling period, it is possible to estimate or calculate the distance to the subject with greater accuracy.
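The following short Python check reproduces formula (1) and the resolution figures above (15 cm at 1 GHz, 7.5 cm at 2 GHz); the helper names are illustrative.

```python
# Numerical check of formula (1) and the sampling-resolution figures above.
C = 299_792_458.0               # speed of light in m/s (~3.0e8)

def distance_m(tof_s: float) -> float:
    """Formula (1): L = C * t / 2."""
    return C * tof_s / 2.0

def range_resolution_m(sampling_hz: float) -> float:
    """One sampling period corresponds to C * Ts / 2 of range."""
    return C / sampling_hz / 2.0

print(distance_m(6.67e-9))            # ~1.0 m for a 6.67 ns flight time
print(range_resolution_m(1e9))        # ~0.15 m at 1 GHz (15 cm)
print(range_resolution_m(2e9))        # ~0.075 m at 2 GHz (7.5 cm)
```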
The histogram generating unit 140 generates a histogram for each light-receiving pixel Pb based on the output value TDCOUT of each light-receiving pixel Pb output from the TDC 123. The histogram relates, for example, to the time from when the light-emitting pixel Pa is driven until the incidence of light on one or more light-receiving pixels Pb is detected. The histogram generating unit 140 generates the histogram for each light-receiving pixel Pb by, for example, adding the output value TDCOUT output from the TDC 123 for each light-receiving pixel Pb to the count value (accumulated value) stored in the BIN corresponding to the sampling period. The histogram generating unit 140 may calculate distance data based on the generated histogram as necessary.
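In other words, histogram generation amounts to counting TDCOUT codes into per-pixel BINs over repeated emission cycles, after which the peak BIN can be mapped to a distance via formula (1). The following simplified Python model illustrates this under assumed array sizes; the names are not from the disclosure.

```python
# Simplified model of per-pixel histogram accumulation from TDC outputs.
import numpy as np

NUM_BINS = 256                         # assumed number of BINs per pixel

def accumulate(histograms: np.ndarray, pixel: int, tdcout: int) -> None:
    """Add one TDC output value to the BIN count of the given pixel."""
    if 0 <= tdcout < NUM_BINS:
        histograms[pixel, tdcout] += 1

def peak_bin(histograms: np.ndarray, pixel: int) -> int:
    """BIN with the highest count; its time maps to distance via formula (1)."""
    return int(np.argmax(histograms[pixel]))

hists = np.zeros((1024, NUM_BINS), dtype=np.uint32)   # 1024 pixels, assumed
for code in (66, 66, 67, 65, 66):                     # repeated detections
    accumulate(hists, pixel=0, tdcout=code)
print(peak_bin(hists, 0))                             # -> 66
```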
The communication unit 150 is a communication interface configured to be able to communicate with the information processing device 200 via the FPC 300. The communication unit 150 transmits the histogram generated for each light-receiving pixel Pb by the histogram generating unit 140, or distance data calculated based on the generated histogram, as transmission data to the information processing device 200 via the FPC 300. The communication unit 150 receives drive mode setting information from the information processing device 200 via the FPC 300, and outputs the received drive mode setting information to the controller 160.
The controller 160 controls the light emitting unit 110 and the light receiving unit 120. The controller 160 controls, for example, the light emission of the light emitting unit 110 and the light reception of the light receiving unit 120. The controller 160 controls the light emitting unit 110 and the light receiving unit 120 based on, for example, control information (for example, drive mode setting information) received from the information processing device 200 via the FPC 300.
Next, the switching of the light emission intensity in the light emitting unit 110 will be described with reference to FIG. 7. FIG. 7 shows an example of the light emission of the light-emitting pixels Pa and the light reception of the light-receiving pixels Pb within one frame period.
In accordance with control by the controller 160, the drive circuit 111 causes, within one frame period, the plurality of light-emitting pixels Pa (light-emitting elements) in a unit group (channel ch) to emit light at a first emission intensity and then at a second emission intensity different from the first emission intensity. Further, in accordance with control by the controller 160, the drive circuit 111 causes the plurality of light-emitting pixels Pa (light-emitting elements) to emit light sequentially for each of the channels ch1 to chN within one frame period. The first emission intensity is, for example, an emission intensity relatively weaker than the second emission intensity. The second emission intensity is, for example, an emission intensity relatively stronger than the first emission intensity. The drive circuit 111 sets the first emission intensity and the second emission intensity according to control information (for example, drive mode setting information) received from the information processing device 200.
When the drive circuit 111 switches the emission intensity as described above within one frame period, in the light emitting unit 110, for example, as shown in FIG. 7, the plurality of light-emitting pixels Pa (light-emitting elements) in a unit group (channel ch) emit light at the first emission intensity and then at the second emission intensity within one frame period. Further, in the light emitting unit 110, for example, as shown in FIG. 7, the plurality of light-emitting pixels Pa (light-emitting elements) emit light sequentially for each of the channels ch1 to chN within one frame period.
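One plausible rendering of this drive sequence is sketched below: each channel fires first at the weak intensity and then at the strong intensity, with the channels advancing in order within the frame. An ordering in which a weak sweep over all channels is followed by a strong sweep would be equally consistent with the text; the intensity values and the fire callback are illustrative assumptions.

```python
# Illustrative schedule of one frame: weak then strong emission per channel.
WEAK, STRONG = 1, 2                    # assumed first/second emission levels

def drive_one_frame(num_channels: int, fire):
    """Call fire(channel, intensity) in the order described in the text."""
    for ch in range(1, num_channels + 1):     # channels ch1 .. chN in order
        fire(ch, WEAK)                        # first emission intensity
        fire(ch, STRONG)                      # then the second intensity

drive_one_frame(4, lambda ch, level: print(f"ch{ch}: intensity {level}"))
```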
At this time, in accordance with control by the controller 160, the drive circuit 121 causes the plurality of light-receiving pixels Pb corresponding to the emitting channel ch to sequentially perform exposure, readout, and standby, in synchronization with the light emission (weak emission) of that channel ch in the light emitting unit 110. Further, in accordance with control by the controller 160, the drive circuit 121 causes the plurality of light-receiving pixels Pb corresponding to the emitting channel ch to sequentially perform exposure, readout, and standby, in synchronization with the light emission (strong emission) of that channel ch in the light emitting unit 110.
Further, in accordance with control by the controller 160, the drive circuit 121 causes, within one frame period, the plurality of light-receiving pixels Pb (light receiving elements) corresponding to each channel ch emitting light (weak emission) to sequentially perform exposure, readout, and standby. Further, in accordance with control by the controller 160, the drive circuit 121 causes, within one frame period, the plurality of light-receiving pixels Pb (light receiving elements) corresponding to each channel ch emitting light (strong emission) to sequentially perform exposure, readout, and standby.
When the drive circuit 121 causes the plurality of light-receiving pixels Pb (light receiving elements) corresponding to the channel ch emitting light (weak emission) to perform exposure, readout, and standby, in the light receiving unit 120, for example, as shown in FIG. 7, the plurality of light-receiving pixels Pb (light receiving elements) corresponding to the unit group (channel ch) sequentially perform exposure, readout, and standby. Likewise, when the drive circuit 121 causes the plurality of light-receiving pixels Pb corresponding to the channel ch emitting light (strong emission) to sequentially perform exposure, readout, and standby, in the light receiving unit 120, the plurality of light-receiving pixels Pb (light receiving elements) corresponding to the unit group (channel ch) sequentially perform exposure, readout, and standby.
 ヒストグラム生成部140は、TDC123から出力された、発光(弱発光)するチャネルchに対応する複数の受光画素Pb(受光素子)の出力値TDCOUTに基づいて、受光画素Pbごとのヒストグラム(被写体までの距離に関する第1のデータ)を生成する。ヒストグラム生成部140は、例えば、TDC123から発光(弱発光)するチャネルchに対応する複数の受光画素Pb(受光素子)ごとに出力された出力値TDCOUTを、サンプリング周期に対応するBINに格納されているカウント値(累積値)に加算することで、受光画素Pbごとのヒストグラム(被写体までの距離に関する第1のデータ)を生成する。ヒストグラム生成部140は、必要に応じて、生成したヒストグラム(被写体までの距離に関する第1のデータ)に基づいて距離データを算出してもよい。 The histogram generating unit 140 generates a histogram (first data related to the distance to the subject) for each light receiving pixel Pb based on the output value TDCOUT of the multiple light receiving pixels Pb (light receiving elements) corresponding to the channel ch that emits light (weak light emission) output from the TDC 123. The histogram generating unit 140 generates a histogram (first data related to the distance to the subject) for each light receiving pixel Pb by, for example, adding the output value TDCOUT output from the TDC 123 for each light receiving pixel Pb (light receiving element) corresponding to the channel ch that emits light (weak light emission) to a count value (accumulated value) stored in a BIN corresponding to the sampling period. The histogram generating unit 140 may calculate distance data based on the generated histogram (first data related to the distance to the subject) as necessary.
Similarly, the histogram generating unit 140 generates a histogram for each light receiving pixel Pb (second data regarding the distance to the subject) based on the output values TDCOUT, output from the TDC 123, of the plurality of light receiving pixels Pb (light receiving elements) corresponding to the strongly emitting channel ch, by adding each output value TDCOUT to the count value (accumulated value) stored in the BIN corresponding to the sampling period. The histogram generating unit 140 may, as necessary, calculate distance data based on the generated histogram (the second data).
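As an illustration of this histogram build-up, the following minimal Python sketch accumulates TDC codes into BINs for one pixel. It assumes, as one common reading of the above, that each output value TDCOUT selects the BIN of the sampling period in which a photon was detected; the BIN count and BIN width (1 ns) are placeholder values, not values from the present disclosure.

    import numpy as np

    N_BINS = 1024               # assumed number of BINs per histogram
    BIN_WIDTH_S = 1e-9          # assumed sampling period per BIN (1 ns)
    C = 299_792_458.0           # speed of light [m/s]

    def accumulate(histogram, tdcout_values):
        # Each TDC code selects the BIN of the sampling period in which a
        # photon was detected; the count stored in that BIN is incremented.
        for code in tdcout_values:
            if 0 <= code < N_BINS:
                histogram[code] += 1

    def bin_to_distance(bin_index):
        # Round-trip time at the BIN center, converted to a one-way distance.
        return 0.5 * C * (bin_index + 0.5) * BIN_WIDTH_S

    hist_weak = np.zeros(N_BINS, dtype=np.int64)    # first data (weak emission)
    hist_strong = np.zeros(N_BINS, dtype=np.int64)  # second data (strong emission)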
The communication unit 150 outputs at least one of the first data and the second data generated by the histogram generating unit 140, together with data on the emission intensity of the light emitting unit 110, as transmission data in a predetermined format. As necessary, the communication unit 150 may instead output distance data corresponding to at least one of the first data and the second data, together with the data on the emission intensity of the light emitting unit 110, as transmission data in the predetermined format.
(Pile-Up phenomenon)
Fig. 8 shows an example of a histogram generated by the histogram generating unit 140. When the subject is relatively close, the histogram obtained with weak emission has its peak at a BIN number smaller than the BIN number corresponding to a distance of 1 m, as shown, for example, in Fig. 8(A). If the peak value is greater than a predetermined threshold, the distance corresponding to that peak corresponds to the distance to the subject. Note that although Figs. 8(A) and 8(B) use 1 m as an example of the boundary between near and not near, the boundary is not limited to 1 m.
When the subject is relatively close, the histogram obtained with strong emission also has its peak at a BIN number smaller than the BIN number corresponding to a distance of 1 m, as shown, for example, in Fig. 8(B). In this case, however, the so-called Pile-Up phenomenon occurs, so even if the peak value exceeds the predetermined threshold, the distance corresponding to the peak is shorter than the actual distance to the subject. With the Pile-Up phenomenon, as shown in Fig. 9 for example, when the BIN number corresponding to the peak obtained when incident light Lb of appropriate intensity enters the light receiving element array 122 (first BIN number) is compared with the BIN number corresponding to the peak obtained when incident light Lb of strong intensity enters the array (second BIN number), the second BIN number is smaller than the first BIN number by a certain amount (ΔX). Therefore, when the subject is relatively close, it is desirable to derive the distance to the subject from the histogram obtained with weak emission rather than from the histogram obtained with strong emission.
Note that when the incident light Lb enters the light receiving element array 122 and a light receiving element responds, the responding element then enters a period during which, in principle, it cannot detect light (dead time). For example, when illumination light La of strong intensity is emitted from the light emitting pixel Pa, the light reflected by the cover glass 130 is also strong and enters the light receiving element. Because the element then responds with high probability and enters its dead time, light reflected by a nearby subject cannot be detected, as shown, for example, in Fig. 10(A).
On the other hand, when illumination light La of moderate intensity is emitted from the light emitting pixel Pa, the probability that a light receiving element responds to light reflected by the cover glass 130 is also suppressed, so that in ranging control that repeats emission and measurement many times, ranging at relatively short distances also becomes possible, as shown, for example, in Fig. 10(B).
For these reasons, when the subject is relatively close, it is desirable to derive the distance to the subject from a histogram obtained with emission of moderate intensity (for example, weak intensity) rather than from a histogram obtained with strong emission.
When the subject is relatively far away, the histogram obtained with weak emission has no peak exceeding the threshold, as shown, for example, in Fig. 8(C), because the intensity of the reflected light is insufficient. The histogram obtained with strong emission, on the other hand, has a peak exceeding the threshold at a position beyond the near/far boundary (1 m), as shown, for example, in Fig. 8(D). Note that although Figs. 8(C) and 8(D) use 1 m as an example of the boundary, the boundary is not limited to 1 m. Therefore, when the subject is relatively far away, it is desirable to derive the distance to the subject from the histogram obtained with strong emission rather than from the histogram obtained with weak emission. When the subject is relatively far away, the Pile-Up phenomenon has almost no effect.
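The selection rule of this section can be sketched as follows, reusing bin_to_distance from the earlier sketch; treating the same threshold as applying to both histograms is an assumption made for illustration.

    def choose_distance(hist_weak, hist_strong, threshold):
        # Prefer the weak-emission histogram: if it shows a peak above the
        # threshold, the subject is near and the Pile-Up-free data are used.
        peak_w = int(hist_weak.argmax())
        if hist_weak[peak_w] > threshold:
            return bin_to_distance(peak_w)   # near subject: weak emission
        # Otherwise fall back to the strong-emission histogram (far subject).
        peak_s = int(hist_strong.argmax())
        if hist_strong[peak_s] > threshold:
            return bin_to_distance(peak_s)   # far subject: strong emission
        return None                          # no peak exceeds the threshold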
[Effects]
Next, the effects of the distance measuring device 100 will be described.
In the present embodiment, within one frame period, the plurality of light emitting pixels Pa (light emitting elements) in a unit group (channel ch) emit light at a first emission intensity and then at a second emission intensity different from the first emission intensity. As a result, even when the subject is close, data can be obtained with little measurement error caused by the Pile-Up phenomenon and almost no dead-time effect caused by reflection at the cover glass; and even when the subject is far away, data from which the distance to the subject can be calculated are obtained. The distance to the subject can therefore be calculated more accurately than when light is emitted at a single intensity.
In the present embodiment, a histogram for each light receiving pixel Pb (first data regarding the distance to the subject) is generated based on the output values TDCOUT, output from the TDC 123, of the plurality of light receiving pixels Pb (light receiving elements) corresponding to the weakly emitting channel ch. Further, a histogram for each light receiving pixel Pb (second data regarding the distance to the subject) is generated based on the output values TDCOUT of the plurality of light receiving pixels Pb corresponding to the strongly emitting channel ch. As a result, even when the subject is close, data with little Pile-Up error and almost no dead-time effect (the first data) are obtained; and even when the subject is far away, data from which the distance to the subject can be calculated (the second data) are obtained. The distance to the subject can therefore be calculated more accurately than when light is emitted at a single intensity.
In the present embodiment, at least one of the first data and the second data generated by the histogram generating unit 140 and the data on the emission intensity of the light emitting unit 110 are output as output data (transmission data) in a predetermined format. This allows the information processing device 200 to accurately calculate the distance to the subject based on the transmission data.
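Purely by way of illustration, the transmission data might be packed as in the following sketch. The field layout is a hypothetical example of a "predetermined format"; the present disclosure does not fix one.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class TransmissionRecord:
        pixel_id: int                        # light receiving pixel Pb
        emission_intensity: str              # "weak" (first data) or "strong" (second data)
        histogram: list                      # BIN counts of the first or second data
        distance_m: Optional[float] = None   # optional distance data

    def to_transmission(records):
        # Serialize the records into a simple framed payload (hypothetical).
        return [
            {"pixel": r.pixel_id,
             "intensity": r.emission_intensity,
             "hist": r.histogram,
             "distance_m": r.distance_m}
            for r in records
        ]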
<2. Modified Examples>
Next, modified examples of the distance measuring device 100 according to the above embodiment will be described.
[Modification A]
In the above embodiment, the distance measuring device 100 may include a storage unit 170, for example, as shown in Fig. 11. The storage unit 170 may be configured, for example, with a random access memory (RAM), a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, or a magneto-optical storage device. The storage unit 170 may store a data set in which MP setting data corresponding to a drive mode is defined, for example, as shown in Fig. 12. This data set will be described in detail later.
The drive modes illustrated in Fig. 12 have, for example, the conditions and characteristics shown in Fig. 13. As shown in Fig. 13, when the drive mode is Dual1 or Dual2, the controller 160 instructs the driving unit 111 to switch the emission intensity, whereas when the drive mode is Single1 or Single2, the controller 160 instructs the driving unit 111 to drive at a single emission intensity.
In this modification, the information processing device 200 (processor 220) transmits drive mode setting information to the distance measuring device 100. On receiving the drive mode setting information from the information processing device 200 (processor 220), the distance measuring device 100 (controller 160) instructs the driving unit 111 to set the emission intensity according to the received drive mode setting information. When the drive mode setting information acquired from the information processing device 200 (processor 220) indicates a mode that assumes the subject is located relatively far away (for example, Dual1 or Dual2), the distance measuring device 100 (controller 160) instructs the driving unit 111 to cause the plurality of light emitting pixels Pa (light emitting elements) in a unit group (channel ch) to emit light at the first emission intensity and then at the second emission intensity within one frame period. A drive mode matching the distance to the subject can thus be selected, so the distance to the subject can be calculated accurately.
Next, the data set stored in the storage unit 170 will be described.
In this data set, the MP setting data are, for example, address data of the light receiving pixels Pb to be activated in accordance with the drive mode. The address data identify a subset of the plurality of light receiving pixels Pb (light receiving elements) included in the light receiving element array 122. A mode that assumes the subject is located relatively close (for example, Dual1, Single2, or Dual2) is assigned address data different from the address data assigned to a mode that assumes the subject is located relatively far away (for example, Single1).
When the drive mode is Dual1, the controller 160 reads the MP setting data data1 corresponding to Dual1 from the data set in the storage unit 170 and instructs the driving unit 121 to activate the plurality of light receiving pixels Pb (light receiving elements) corresponding to the read MP setting data data1. By activating, among the plurality of light receiving pixels Pb, those corresponding to the MP setting data data1, the driving unit 121 causes the activated light receiving pixels Pb to output light reception data.
Likewise, when the drive mode is Single1, Single2, or Dual2, the controller 160 reads the MP setting data data2, data3, or data4 corresponding to that mode from the data set in the storage unit 170 and instructs the driving unit 121 to activate the light receiving pixels Pb (light receiving elements) corresponding to the read MP setting data. The driving unit 121 activates those light receiving pixels Pb and causes them to output light reception data.
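The mode-dependent lookup described in the preceding paragraphs can be pictured with the following sketch, in which the table contents stand in for the address data of Fig. 12 and the activate call is hypothetical.

    MP_SETTING_DATA = {          # drive mode -> key of the MP setting data
        "Dual1": "data1",
        "Single1": "data2",
        "Single2": "data3",
        "Dual2": "data4",
    }

    def apply_drive_mode(mode, storage, driver):
        # Controller 160 behaviour: look up the MP setting data for the
        # requested drive mode and tell the driving unit 121 to activate
        # exactly the light receiving pixels Pb listed in that address data.
        addresses = storage[MP_SETTING_DATA[mode]]   # address data of pixels Pb
        driver.activate(addresses)                   # only these pixels output data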
As described above, suppose that when only some of the plurality of light receiving pixels Pb (light receiving elements) in the light receiving element array 122 are active, the region containing the active pixels (active pixel region MP) is fixed regardless of the distance to the subject, for example as shown in Fig. 14. In that case, because of parallax, the position of the region illuminated by the incident light Lb (illumination spot Rb) shifts according to the distance to the subject, for example as shown in Fig. 14, so the illumination spot Rb may fall outside the active pixel region MP. If the illumination spot Rb falls outside the active pixel region MP, the subject cannot be detected.
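As a rough, purely illustrative estimate of this parallax shift (the optical parameters below are assumptions, not values from the present disclosure), a pinhole model gives a spot displacement on the array of about f*b/Z for a subject at distance Z, an emitter-receiver baseline b, and a receiver focal length f.

    def spot_shift_pixels(distance_m,
                          baseline_m=0.01,        # emitter-receiver offset (assumed)
                          focal_m=0.004,          # receiver focal length (assumed)
                          pixel_pitch_m=10e-6):   # pixel pitch (assumed)
        # Pinhole-model displacement of the illumination spot Rb on the
        # light receiving element array for a subject at distance_m.
        return (focal_m * baseline_m / distance_m) / pixel_pitch_m

    # e.g. about 40 pixels at 0.1 m but only 0.4 pixels at 10 m, which is
    # why a fixed active pixel region MP can miss the spot for near subjects.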
One conceivable countermeasure is to enlarge the active pixel region MP in the light receiving element array 122, for example as shown in Fig. 15. Doing so, however, increases the power consumption of the light receiving element array 122 in proportion to the enlarged area of the active pixel region MP.
In this modification, by contrast, when the controller 160 receives drive mode setting information from the information processing device 200 (processor 220), it reads the MP setting data corresponding to the received drive mode setting information from the data set in the storage unit 170. The controller 160 then instructs the driving unit 121 to activate, among the plurality of light receiving pixels Pb (light receiving elements) in the light receiving element array 122, those corresponding to the read MP setting data. The controller 160 can thereby cause the activated light receiving pixels Pb to output light reception data.
In accordance with control from the controller 160, the driving unit 121 sets the active pixel region MP according to the drive mode. This makes it possible to selectively activate, in the light receiving element array 122, the light receiving pixels Pb contained in the region on which the incident light Lb falls. As a result, the power consumption of the light receiving element array 122 can be kept lower than when all of its light receiving pixels Pb are activated. In addition, at least part of the illumination spot Rb can be made to overlap at least part of the active pixel region MP regardless of the distance to the subject. The subject can therefore be reliably detected regardless of its distance while the power consumption of the light receiving element array 122 is kept low, and the distance to the subject can be derived accurately.
In this modification, furthermore, when the controller 160 receives drive mode setting information from the information processing device 200 (processor 220), it reads the MP setting data corresponding to the received drive mode setting information from the data set in the storage unit 170 and instructs the driving unit 121 to activate the light receiving pixels Pb (light receiving elements) corresponding to the read MP setting data. The controller 160 can thus determine which light receiving pixels Pb to activate according to the drive mode acquired from the information processing device 200 (processor 220).
When the drive mode acquired from the information processing device 200 (processor 220) assumes that the subject is located relatively far away (for example, Dual1 or Single1), the controller 160 sets the active pixel region MP at, for example, the position shown in Fig. 16(A). When the acquired drive mode assumes that the subject is located at a relatively medium distance (for example, Dual2), the controller 160 sets the active pixel region MP at, for example, the position shown in Fig. 16(B). When the acquired drive mode assumes that the subject is located relatively close (for example, Single2), the controller 160 sets the active pixel region MP at, for example, the position shown in Fig. 16(C).
In this way, at least part of the illumination spot Rb can be made to overlap at least part of the active pixel region MP regardless of the drive mode. As a result, the subject can be reliably detected regardless of its distance while the power consumption of the light receiving element array 122 is kept low. Even when the drive mode is changed dynamically under control from the information processing device 200 (processor 220), the subject can be reliably detected regardless of the drive mode while power consumption is kept low. The distance to the subject can therefore be derived accurately.
[Modification B]
In the above embodiment and its modified examples, the histogram generating unit 140 may calculate distance data based on the generated histogram. In this case, the controller 160 may dynamically set the drive mode based on the distance data calculated by the histogram generating unit 140.
Fig. 17 shows an example of drive modes. Fig. 17 lists three drive modes: a basic mode, a close-range mode, and a medium-range mode. Based on the distance data calculated by the histogram generating unit 140, the controller 160 dynamically sets, for example, one of the three drive modes listed in Fig. 17.
Fig. 18 shows an example of a procedure for dynamically setting one of the three drive modes listed in Fig. 17. The controller 160 first controls the driving units 111 and 121 in the basic mode of Fig. 17 (step S101) and, while doing so, acquires the distance data D calculated by the histogram generating unit 140 (step S102). When the distance data D is shorter than 3 m and also shorter than 1 m (step S103: Y, step S104: Y), the controller 160 controls the driving units 111 and 121 in the close-range mode of Fig. 17 (step S105). When the distance data D is shorter than 3 m but 1 m or more (step S103: Y, step S104: N), the controller 160 controls the driving units 111 and 121 in the medium-range mode of Fig. 17 (step S106).
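The procedure of Fig. 18 reduces to a simple decision function. The following sketch mirrors the branch structure described above; the mode names follow Fig. 17, and the 1 m and 3 m boundaries are the example values of the text.

    def select_drive_mode(distance_m):
        # Branch structure of Fig. 18: start from the basic mode and
        # re-select the mode from the latest distance data D.
        if distance_m < 3.0:                 # step S103: Y
            if distance_m < 1.0:             # step S104: Y
                return "close-range mode"    # step S105
            return "medium-range mode"       # step S104: N -> step S106
        return "basic mode"                  # step S103: N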
In this modification, as described above, the drive mode is set dynamically based on the distance data calculated by the histogram generating unit 140. This makes it possible to derive the distance to the subject accurately.
[Modification C]
In the above embodiment and its modified examples, the distance measuring device 100 may further include a reliability calculation unit 180 that calculates a reliability C (score value) as shown in Fig. 19. The reliability C is useful when the subject is at a close distance.
For example, suppose that, as shown in Fig. 20(A), the distance corresponding to the peak of the histogram obtained with weak emission when the subject is close (the first data) is only slightly larger than the near/far boundary (for example, 1 m). Suppose further that, as shown in Fig. 20(B), the peak of the histogram obtained with strong emission when the subject is close (the second data) is, because of the Pile-Up phenomenon, only slightly smaller than the near/far boundary (for example, 1 m).
In this situation, the distance to the subject cannot be derived from the first data and the second data alone. To avoid this problem, the reliability calculating unit 180 calculates the reliability C (= C1) of the first data and the reliability C (= C2) of the second data, and thereby judges whether the first data and the second data can be regarded as the result of a correct measurement.
Specifically, the distance measuring device 100 first performs a drive that switches between strong emission and weak emission within one frame period and acquires, based on the light received during that drive, a histogram for each light receiving pixel Pb (light receiving element) (the first data and the second data) (strong/weak emission ranging, step S201). Next, the reliability calculating unit 180 calculates the difference between the peak value and the noise value contained in the histogram of the first data (first S/N ratio) and calculates the reliability C (= C1) based on the calculated first S/N ratio (step S202).
When the first S/N ratio is greater than a predetermined threshold TH1 (step S203: Y), the reliability calculating unit 180 judges that the first data are trustworthy and outputs the reliability C (= C1) (step S205). When the first S/N ratio is less than or equal to the threshold TH1 (step S203: N), the reliability calculating unit 180 judges that the first data are not trustworthy. In that case, the reliability calculating unit 180 calculates the difference between the peak value and the noise value contained in the histogram of the second data (second S/N ratio) and calculates the reliability C (= C2) based on the calculated second S/N ratio (step S204). When the second S/N ratio is greater than a predetermined threshold TH2, the reliability calculating unit 180 judges that the second data are trustworthy and outputs the reliability C (= C2) (step S205). When the second S/N ratio is less than or equal to the threshold TH2, the reliability calculating unit 180 judges that the second data are not trustworthy and outputs the reliability C (= C2) (step S205).
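The reliability flow of steps S201 to S205 can be sketched as follows. Taking the histogram median as the noise level, and the thresholds TH1 and TH2 as free parameters, are illustrative assumptions.

    import numpy as np

    def snr(hist):
        # S/N taken, as in the text, as the histogram peak minus a noise
        # level; using the median as the noise estimate is an assumption.
        return float(hist.max() - np.median(hist))

    def reliability(hist_weak, hist_strong, th1, th2):
        c1 = snr(hist_weak)          # reliability C (= C1) of the first data
        if c1 > th1:                 # step S203: Y
            return "first", c1       # first data judged trustworthy (S205)
        c2 = snr(hist_strong)        # reliability C (= C2) of the second data (S204)
        if c2 > th2:
            return "second", c2      # second data judged trustworthy (S205)
        return "none", c2            # neither trusted; the score is still output (S205)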
The communication unit 150 outputs at least one of the first data and the second data, the data on the emission intensity of the light emitting unit 110, and the reliability C as transmission data in a predetermined format. As necessary, the communication unit 150 may instead output distance data corresponding to at least one of the first data and the second data, the data on the emission intensity of the light emitting unit 110, and the reliability C as transmission data in the predetermined format.
In this modification, transmission data including the reliability C (score value) are output. This makes it possible to derive the distance to the subject from at least one of the first data and the second data, and thereby avoids situations in which the peak positions in the first data and the second data make it impossible to derive the distance to the subject.
<3. Application Examples>
The technology according to the present disclosure can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of moving object, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
Fig. 23 is a block diagram showing a schematic configuration example of a vehicle control system 7000, which is an example of a moving object control system to which the technology according to the present disclosure can be applied. The vehicle control system 7000 includes a plurality of electronic control units connected via a communication network 7010. In the example shown in Fig. 23, the vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, an outside-vehicle information detection unit 7400, an in-vehicle information detection unit 7500, and an integrated control unit 7600. The communication network 7010 connecting these control units may be an in-vehicle communication network conforming to any standard, such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
Each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer or the parameters used in various calculations, and a drive circuit that drives the devices to be controlled. Each control unit includes a network I/F for communicating with other control units via the communication network 7010, and a communication I/F for communicating with devices or sensors inside and outside the vehicle by wired or wireless communication. Fig. 23 shows, as the functional configuration of the integrated control unit 7600, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio/image output unit 7670, an in-vehicle network I/F 7680, and a storage unit 7690. The other control units similarly include a microcomputer, a communication I/F, a storage unit, and so on.
The drive system control unit 7100 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 7100 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, and a braking device for generating the braking force of the vehicle. The drive system control unit 7100 may also function as a control device such as an ABS (Antilock Brake System) or ESC (Electronic Stability Control).
A vehicle state detecting section 7110 is connected to the drive system control unit 7100. The vehicle state detecting section 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the axial rotational motion of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the amount of operation of the accelerator pedal, the amount of operation of the brake pedal, the steering angle of the steering wheel, the engine speed, the rotation speed of the wheels, and the like. The drive system control unit 7100 performs arithmetic processing using signals input from the vehicle state detecting section 7110 and controls the internal combustion engine, the drive motor, an electric power steering device, a brake device, and so on.
The body system control unit 7200 controls the operation of various devices mounted on the vehicle body according to various programs. For example, the body system control unit 7200 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, and fog lamps. In this case, radio waves transmitted from a portable device substituting for a key, or signals from various switches, can be input to the body system control unit 7200. The body system control unit 7200 accepts the input of these radio waves or signals and controls the door lock device, power window device, lamps, and the like of the vehicle.
The battery control unit 7300 controls a secondary battery 7310, which is the power supply source for the drive motor, according to various programs. For example, information such as the battery temperature, the battery output voltage, or the remaining battery capacity is input to the battery control unit 7300 from a battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals and controls the temperature regulation of the secondary battery 7310, a cooling device provided in the battery device, and so on.
The outside-vehicle information detection unit 7400 detects information about the outside of the vehicle on which the vehicle control system 7000 is mounted. For example, at least one of an imaging unit 7410 and an outside-vehicle information detecting section 7420 is connected to the outside-vehicle information detection unit 7400. The imaging unit 7410 includes at least one of a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras. The outside-vehicle information detecting section 7420 includes, for example, at least one of an environment sensor for detecting the current weather or meteorological conditions and a surrounding information detecting sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle on which the vehicle control system 7000 is mounted.
The environment sensor may be, for example, at least one of a raindrop sensor that detects rain, a fog sensor that detects fog, a sunshine sensor that detects the degree of sunshine, and a snow sensor that detects snowfall. The surrounding information detecting sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device. The imaging unit 7410 and the outside-vehicle information detecting section 7420 may each be provided as an independent sensor or device, or may be provided as a device in which a plurality of sensors or devices are integrated.
Here, Fig. 24 shows an example of the installation positions of the imaging unit 7410 and the outside-vehicle information detecting section 7420. Imaging units 7910, 7912, 7914, 7916, and 7918 are provided at, for example, at least one of the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of a vehicle 7900. The imaging unit 7910 provided on the front nose and the imaging unit 7918 provided on the upper part of the windshield in the vehicle interior mainly acquire images ahead of the vehicle 7900. The imaging units 7912 and 7914 provided on the side mirrors mainly acquire images of the sides of the vehicle 7900. The imaging unit 7916 provided on the rear bumper or the back door mainly acquires images behind the vehicle 7900. The imaging unit 7918 provided on the upper part of the windshield in the vehicle interior is mainly used to detect preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
Fig. 24 also shows an example of the imaging ranges of the imaging units 7910, 7912, 7914, and 7916. An imaging range a indicates the imaging range of the imaging unit 7910 provided on the front nose, imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided on the side mirrors, and an imaging range d indicates the imaging range of the imaging unit 7916 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 7910, 7912, 7914, and 7916, an overhead image of the vehicle 7900 viewed from above is obtained.
Outside-vehicle information detecting sections 7920, 7922, 7924, 7926, 7928, and 7930 provided at the front, rear, sides, and corners of the vehicle 7900 and on the upper part of the windshield in the vehicle interior may be, for example, ultrasonic sensors or radar devices. The outside-vehicle information detecting sections 7920, 7926, and 7930 provided on the front nose, the rear bumper, and the back door of the vehicle 7900 and on the upper part of the windshield in the vehicle interior may be, for example, LIDAR devices. These outside-vehicle information detecting sections 7920 to 7930 are mainly used to detect preceding vehicles, pedestrians, obstacles, and the like.
Returning to Fig. 23, the description continues. The outside-vehicle information detection unit 7400 causes the imaging unit 7410 to capture images of the outside of the vehicle and receives the captured image data. The outside-vehicle information detection unit 7400 also receives detection information from the connected outside-vehicle information detecting section 7420. When the outside-vehicle information detecting section 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the outside-vehicle information detection unit 7400 causes it to transmit ultrasonic waves, electromagnetic waves, or the like, and receives information on the received reflected waves. Based on the received information, the outside-vehicle information detection unit 7400 may perform object detection processing or distance detection processing for people, vehicles, obstacles, signs, characters on the road surface, and the like. Based on the received information, the outside-vehicle information detection unit 7400 may perform environment recognition processing for recognizing rainfall, fog, road surface conditions, and the like, and may calculate the distance to an object outside the vehicle.
Further, based on the received image data, the outside-vehicle information detection unit 7400 may perform image recognition processing for recognizing people, vehicles, obstacles, signs, characters on the road surface, and the like, or distance detection processing. The outside-vehicle information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data and combine image data captured by different imaging units 7410 to generate an overhead image or a panoramic image. The outside-vehicle information detection unit 7400 may also perform viewpoint conversion processing using image data captured by different imaging units 7410.
The in-vehicle information detection unit 7500 detects information about the inside of the vehicle. For example, a driver state detecting section 7510 that detects the state of the driver is connected to the in-vehicle information detection unit 7500. The driver state detecting section 7510 may include a camera that images the driver, a biometric sensor that detects biometric information of the driver, a microphone that collects sound in the vehicle interior, and the like. The biometric sensor is provided, for example, on the seat surface or the steering wheel and detects biometric information of an occupant sitting in the seat or the driver gripping the steering wheel. Based on the detection information input from the driver state detecting section 7510, the in-vehicle information detection unit 7500 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off. The in-vehicle information detection unit 7500 may perform processing such as noise canceling on the collected audio signal.
The integrated control unit 7600 controls overall operation within the vehicle control system 7000 according to various programs. An input unit 7800 is connected to the integrated control unit 7600. The input unit 7800 is realized by a device that can be operated by an occupant, such as a touch panel, buttons, a microphone, switches, or levers. Data obtained by recognizing speech input through the microphone may be input to the integrated control unit 7600. The input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or PDA (Personal Digital Assistant) supporting operation of the vehicle control system 7000. The input unit 7800 may also be, for example, a camera, in which case an occupant can input information by gesture; alternatively, data obtained by detecting the movement of a wearable device worn by an occupant may be input. Further, the input unit 7800 may include, for example, an input control circuit that generates an input signal based on information input by an occupant or the like using the input unit 7800 and outputs the input signal to the integrated control unit 7600. By operating the input unit 7800, an occupant or the like inputs various data to the vehicle control system 7000 and instructs it to perform processing operations.
The storage unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer, and a RAM (Random Access Memory) that stores various parameters, calculation results, sensor values, and the like. The storage unit 7690 may also be realized by a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
The general-purpose communication I/F 7620 is a general-purpose communication I/F that mediates communication with various devices present in an external environment 7750. The general-purpose communication I/F 7620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System of Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as wireless LAN (also called Wi-Fi (registered trademark)) or Bluetooth (registered trademark). The general-purpose communication I/F 7620 may connect, for example via a base station or an access point, to a device (for example, an application server or a control server) present on an external network (for example, the Internet, a cloud network, or an operator-specific network). The general-purpose communication I/F 7620 may also connect to a terminal present near the vehicle (for example, a terminal of the driver, a pedestrian, or a store, or an MTC (Machine Type Communication) terminal) using, for example, P2P (Peer To Peer) technology.
The dedicated communication I/F 7630 is a communication I/F that supports a communication protocol designed for use in vehicles. The dedicated communication I/F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which is a combination of IEEE 802.11p for the lower layer and IEEE 1609 for the upper layer, DSRC (Dedicated Short Range Communications), or a cellular communication protocol. The dedicated communication I/F 7630 typically carries out V2X communication, a concept that includes one or more of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication.
The positioning unit 7640 performs positioning by receiving, for example, GNSS signals from GNSS (Global Navigation Satellite System) satellites (for example, GPS signals from GPS (Global Positioning System) satellites) and generates position information including the latitude, longitude, and altitude of the vehicle. The positioning unit 7640 may identify the current position by exchanging signals with a wireless access point, or may acquire position information from a terminal having a positioning function, such as a mobile phone, PHS, or smartphone.
The beacon receiving unit 7650 receives, for example, radio waves or electromagnetic waves transmitted from radio stations installed on roads and acquires information such as the current position, traffic congestion, road closures, or required travel time. The function of the beacon receiving unit 7650 may be included in the dedicated communication I/F 7630 described above.
The in-vehicle device I/F 7660 is a communication interface that mediates connections between the microcomputer 7610 and various in-vehicle devices 7760 present in the vehicle. The in-vehicle device I/F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB). The in-vehicle device I/F 7660 may also establish a wired connection such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-definition Link) via a connection terminal (and, if necessary, a cable), not shown. The in-vehicle devices 7760 may include, for example, at least one of a mobile device or wearable device carried by an occupant and an information device carried into or attached to the vehicle. The in-vehicle devices 7760 may also include a navigation device that searches for a route to an arbitrary destination. The in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
 The in-vehicle network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010. The in-vehicle network I/F 7680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 7010.
 The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 according to various programs on the basis of information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680. For example, the microcomputer 7610 may calculate a control target value of the driving force generating device, the steering mechanism, or the braking device on the basis of the acquired information inside and outside the vehicle, and output a control command to the drive system control unit 7100. For example, the microcomputer 7610 may perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including collision avoidance or impact mitigation of the vehicle, following travel based on the inter-vehicle distance, vehicle speed maintenance travel, vehicle collision warning, and vehicle lane departure warning. The microcomputer 7610 may also perform cooperative control for the purpose of automated driving, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like on the basis of the acquired information about the surroundings of the vehicle.
 The microcomputer 7610 may generate three-dimensional distance information between the vehicle and objects such as surrounding structures and persons on the basis of information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680, and may create local map information including information about the surroundings of the current position of the vehicle. The microcomputer 7610 may also predict dangers such as a vehicle collision, the approach of a pedestrian or the like, or entry into a closed road on the basis of the acquired information, and generate a warning signal. The warning signal may be, for example, a signal for generating a warning sound or turning on a warning lamp.
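 As a purely illustrative aside (not part of the disclosed system), danger prediction of this kind is often framed as a time-to-collision check on the measured distance and closing speed. The minimal sketch below assumes a hypothetical `ttc_warning` helper and a 2-second threshold, neither of which is taken from the disclosure:

```python
# Illustrative sketch only: deriving a warning signal from distance data
# using a simple time-to-collision (TTC) criterion. The name `ttc_warning`
# and TTC_THRESHOLD_S are hypothetical, not from the patent.

TTC_THRESHOLD_S = 2.0  # assumed warning threshold in seconds

def ttc_warning(distance_m: float, closing_speed_mps: float) -> bool:
    """Return True if the time to collision falls below the threshold."""
    if closing_speed_mps <= 0.0:  # object is not approaching
        return False
    ttc = distance_m / closing_speed_mps
    return ttc < TTC_THRESHOLD_S

# Example: an object 15 m ahead closing at 10 m/s (TTC = 1.5 s) -> warn.
assert ttc_warning(15.0, 10.0)
```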
 The audio/image output unit 7670 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the passengers of the vehicle or the outside of the vehicle of information. In the example of FIG. 23, an audio speaker 7710, a display unit 7720, and an instrument panel 7730 are illustrated as the output devices. The display unit 7720 may include, for example, at least one of an on-board display and a head-up display. The display unit 7720 may have an AR (Augmented Reality) display function. The output device may be a device other than these, such as headphones, a wearable device such as a glasses-type display worn by a passenger, a projector, or a lamp. When the output device is a display device, the display device visually displays the results obtained by the various processes performed by the microcomputer 7610 or the information received from other control units in various formats such as text, images, tables, and graphs. When the output device is an audio output device, the audio output device converts an audio signal composed of reproduced audio data, acoustic data, or the like into an analog signal and audibly outputs the analog signal.
 In the example shown in FIG. 23, at least two control units connected via the communication network 7010 may be integrated into one control unit. Alternatively, each control unit may be composed of a plurality of control units. Furthermore, the vehicle control system 7000 may include another control unit not shown. In the above description, some or all of the functions performed by any of the control units may be provided to another control unit. That is, as long as information is transmitted and received via the communication network 7010, predetermined arithmetic processing may be performed by any of the control units. Similarly, a sensor or device connected to any of the control units may be connected to another control unit, and a plurality of control units may mutually transmit and receive detection information via the communication network 7010.
 Note that a computer program for realizing each function of the distance measuring device 100 and the information processing device 200 described above can be implemented in any of the control units or the like. A computer-readable recording medium storing such a computer program can also be provided. The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like. The above computer program may also be distributed, for example, via a network without using a recording medium.
 In the vehicle control system 7000 described above, the distance measuring device 100 and the information processing device 200 described above can be used, for example, as a light source steering unit of a LIDAR serving as an environment sensor. Image recognition in the imaging unit can also be performed by an optical computing unit using the distance measuring device 100 and the information processing device 200 described above.
 At least some of the components of the distance measuring device 100 and the information processing device 200 described above may be realized in a module (for example, an integrated circuit module configured on a single die) for the integrated control unit 7600 shown in FIG. 23. Alternatively, the distance measuring device 100 and the information processing device 200 described above may be realized by a plurality of control units of the vehicle control system 7000 shown in FIG. 23.
 Although the present disclosure has been described above with reference to the embodiment, its modifications, and application examples, the present disclosure is not limited to the above-described embodiment and the like, and various modifications are possible. Note that the effects described in this specification are merely examples. The effects of the present disclosure are not limited to the effects described in this specification. The present disclosure may have effects other than those described in this specification.
 Furthermore, for example, the present disclosure can have the following configurations.
(1)
 A distance measuring device including:
 a light emitting unit that emits irradiation light; and
 a distance measuring unit that calculates data regarding a distance to a subject based on incident light obtained by irradiating the subject with the irradiation light,
 wherein the light emitting unit has a plurality of light emitting elements arranged in an array, and a first driving unit that sequentially causes the plurality of light emitting elements to emit light for each predetermined unit group within one frame period, and
 the first driving unit causes the plurality of light emitting elements in the unit group to emit light at a first emission intensity and then at a second emission intensity different from the first emission intensity within the one frame period.
(2)
The distance measuring device described in (1), wherein the distance measuring unit calculates first data regarding the distance to the subject obtained when the multiple light-emitting elements are made to emit light at the first emission intensity, and second data regarding the distance to the subject obtained when the multiple light-emitting elements are made to emit light at the second emission intensity.
(3)
The distance measuring device described in (2), further comprising an output unit that outputs at least one of the first data and the second data calculated by the distance measuring unit and data regarding the light emission intensity of the light emitting unit as output data in a predetermined format.
(4)
 The distance measuring device according to any one of (1) to (3), further including an acquisition unit that acquires a control signal for setting a drive mode, wherein the first driving unit sets the first emission intensity and the second emission intensity according to the drive mode acquired by the acquisition unit.
(5)
The distance measuring device described in (4), wherein when the drive mode acquired by the acquisition unit is a mode assuming that the subject is located at a relatively long distance, the first drive unit causes the multiple light-emitting elements in the unit group to emit light at the first emission intensity and then at the second emission intensity during the one frame period.
(6)
 The distance measuring device according to (4) or (5), further including a light receiving unit that receives the incident light, wherein the light receiving unit has a plurality of light receiving elements arranged in an array, and a second driving unit that causes the activated light receiving elements to output light reception data by activating at least some of the plurality of light receiving elements, and the second driving unit determines the light receiving elements to be activated according to the drive mode acquired by the acquisition unit.
(7)
 The distance measuring device according to (2) or (3), further including a light receiving unit that receives the incident light, wherein the light receiving unit has a plurality of light receiving elements arranged in an array, and a second driving unit that causes the activated light receiving elements to output light reception data by activating at least some of the plurality of light receiving elements, each of the light receiving elements is a SPAD (Single Photon Avalanche Diode), and the distance measuring unit generates, for each pixel including one or more of the light receiving elements, a histogram regarding the time from when the light emitting element is driven until incidence of light on the one or more light receiving elements is detected, and uses the generated histogram, or distance information calculated based on the generated histogram, as the data regarding the distance to the subject.
(8)
 The distance measuring device according to (7), wherein the distance measuring unit calculates a reliability of the histogram based on the generated histogram, and the distance measuring device further includes an output unit that outputs at least one of the first data and the second data calculated by the distance measuring unit, data regarding the emission intensity of the light emitting unit, and the reliability as output data in a predetermined format.
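 Configuration (7) above describes a direct time-of-flight measurement: per pixel, photon detection times are accumulated into a histogram over many emission cycles, and the distance follows from the histogram or from a peak extracted from it. The following is a minimal sketch of that idea under assumed parameters; the 1 ns bin width, 256-bin depth, and simple arg-max peak picking are illustrative choices, not values from the disclosure:

```python
import numpy as np

# Illustrative dToF sketch: accumulate SPAD photon timestamps into a
# histogram and convert the peak bin to a distance. BIN_WIDTH_S, NUM_BINS,
# and the arg-max peak picking are assumptions for illustration only.

C_MPS = 299_792_458.0   # speed of light
BIN_WIDTH_S = 1e-9      # assumed TDC bin width (1 ns)
NUM_BINS = 256          # assumed histogram depth

def build_histogram(timestamps_s: np.ndarray) -> np.ndarray:
    """Bin photon arrival times measured from the emission trigger."""
    bins = (timestamps_s / BIN_WIDTH_S).astype(int)
    bins = bins[(bins >= 0) & (bins < NUM_BINS)]
    return np.bincount(bins, minlength=NUM_BINS)

def distance_from_histogram(hist: np.ndarray) -> float:
    """Take the peak bin as the round-trip time of flight and halve it."""
    peak_bin = int(np.argmax(hist))
    tof_s = (peak_bin + 0.5) * BIN_WIDTH_S  # bin-center time of flight
    return C_MPS * tof_s / 2.0              # divide by 2: round trip

# Example: echoes near a 66.7 ns round trip correspond to roughly 10 m.
rng = np.random.default_rng(0)
ts = rng.normal(66.7e-9, 1e-9, size=500)   # simulated photon returns
print(distance_from_histogram(build_histogram(ts)))  # ~10 m
```

 In practice, ambient-light subtraction and sub-bin interpolation would refine the peak estimate; the sketch keeps only the core histogram-to-distance step.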
 In the distance measuring device according to one aspect of the present disclosure, within one frame period, the plurality of light emitting elements in a unit group emit light at a first emission intensity and then at a second emission intensity different from the first emission intensity. This makes it possible to obtain data with little measurement error caused by the pile-up phenomenon and with almost no influence of dead time caused by reflection at the cover glass, even when the subject is at a close distance. Furthermore, even when the subject is at a long distance, data from which the distance to the subject can be calculated can be obtained. As a result, distance measurement accuracy, particularly at close distances, can be improved compared with emitting light at a single intensity.
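 As a hedged illustration of how a downstream consumer might combine the first and second data with the reliability of configuration (8), the sketch below prefers the lower-intensity result at close range, where pile-up bias is the stated concern. The assignment of "first = high, second = low", the 0.5 reliability cut, and the 1 m near-range limit are all assumptions, not the disclosed behavior:

```python
from dataclasses import dataclass

# Hypothetical per-pixel merge of the two distance datasets produced by
# the two emission intensities (here assumed: high first, low second).
# The thresholds below are illustrative heuristics, not from the patent.

@dataclass
class Measurement:
    distance_m: float   # distance derived from the histogram peak
    reliability: float  # histogram reliability in [0.0, 1.0]

NEAR_RANGE_M = 1.0  # assumed range where pile-up bias dominates

def merge(high: Measurement, low: Measurement) -> Measurement:
    """Pick one of the two per-pixel results for the output frame."""
    near = min(high.distance_m, low.distance_m) < NEAR_RANGE_M
    if near and low.reliability >= 0.5:
        return low  # low intensity: little pile-up at close range
    # Otherwise keep whichever histogram was judged more reliable.
    return high if high.reliability >= low.reliability else low

# Example: a subject at ~0.4 m where the high-intensity shot piles up.
print(merge(Measurement(0.32, 0.4), Measurement(0.41, 0.9)))  # -> low
```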
 Those skilled in the art may conceive of various modifications, combinations, subcombinations, and alterations depending on design requirements and other factors, and it is understood that these fall within the scope of the appended claims and their equivalents.
 This application claims priority based on Japanese Patent Application No. 2022-174231, filed with the Japan Patent Office on October 31, 2022, the entire contents of which are incorporated herein by reference.

Claims (8)

  1. A distance measuring device comprising:
     a light emitting unit that emits irradiation light; and
     a distance measuring unit that calculates data regarding a distance to a subject based on incident light obtained by irradiating the subject with the irradiation light,
     wherein the light emitting unit has a plurality of light emitting elements arranged in an array, and a first driving unit that sequentially causes the plurality of light emitting elements to emit light for each predetermined unit group within one frame period, and
     the first driving unit causes the plurality of light emitting elements in the unit group to emit light at a first emission intensity and then at a second emission intensity different from the first emission intensity within the one frame period.
  2. The distance measuring device according to claim 1, wherein the distance measuring unit calculates first data regarding the distance to the subject obtained when the plurality of light emitting elements are caused to emit light at the first emission intensity, and second data regarding the distance to the subject obtained when the plurality of light emitting elements are caused to emit light at the second emission intensity.
  3. The distance measuring device according to claim 2, further comprising an output unit that outputs at least one of the first data and the second data calculated by the distance measuring unit and data regarding the emission intensity of the light emitting unit as output data in a predetermined format.
  4. The distance measuring device according to claim 1, further comprising an acquisition unit that acquires a control signal for setting a drive mode, wherein the first driving unit sets the first emission intensity and the second emission intensity according to the drive mode acquired by the acquisition unit.
  5. The distance measuring device according to claim 4, wherein, when the drive mode acquired by the acquisition unit is a mode assuming that the subject is located at a relatively long distance, the first driving unit causes the plurality of light emitting elements in the unit group to emit light at the first emission intensity and then at the second emission intensity within the one frame period.
  6. The distance measuring device according to claim 4, further comprising a light receiving unit that receives the incident light, wherein the light receiving unit has a plurality of light receiving elements arranged in an array, and a second driving unit that causes the activated light receiving elements to output light reception data by activating at least some of the plurality of light receiving elements, and the second driving unit determines the light receiving elements to be activated according to the drive mode acquired by the acquisition unit.
  7. The distance measuring device according to claim 2, further comprising a light receiving unit that receives the incident light, wherein the light receiving unit has a plurality of light receiving elements arranged in an array, and a second driving unit that causes the activated light receiving elements to output light reception data by activating at least some of the plurality of light receiving elements, each of the light receiving elements is a SPAD (Single Photon Avalanche Diode), and the distance measuring unit generates, for each pixel including one or more of the light receiving elements, a histogram regarding the time from when the light emitting element is driven until incidence of light on the one or more light receiving elements is detected, and uses the generated histogram, or distance information calculated based on the generated histogram, as the data regarding the distance to the subject.
  8. The distance measuring device according to claim 7, wherein the distance measuring unit calculates a reliability of the histogram based on the generated histogram, and the distance measuring device further comprises an output unit that outputs at least one of the first data and the second data calculated by the distance measuring unit, data regarding the emission intensity of the light emitting unit, and the reliability as output data in a predetermined format.
PCT/JP2023/033884 2022-10-31 2023-09-19 Ranging device WO2024095626A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022174231 2022-10-31
JP2022-174231 2022-10-31

Publications (1)

Publication Number Publication Date
WO2024095626A1 (en)

Family

ID=90930241

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/033884 WO2024095626A1 (en) 2022-10-31 2023-09-19 Ranging device

Country Status (1)

Country Link
WO (1) WO2024095626A1 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010091377A (en) * 2008-10-07 2010-04-22 Toyota Motor Corp Apparatus and method for optical distance measurement
US20200249318A1 (en) * 2019-01-31 2020-08-06 The University Court Of The University Of Edinburgh Strobe window dependent illumination for flash lidar
JP2021071478A (en) * 2019-10-25 2021-05-06 株式会社リコー Detector and method for detection
WO2021181868A1 (en) * 2020-03-09 2021-09-16 ソニーセミコンダクタソリューションズ株式会社 Distance sensor and distance measurement method

Similar Documents

Publication Publication Date Title
JP7246863B2 (en) Photodetector, vehicle control system and rangefinder
JPWO2018003227A1 (en) Ranging device and ranging method
TWI828695B (en) Light receiving device and distance measuring device
TW202030500A (en) Ranging device and ranging method
US20220003849A1 (en) Distance measuring device and distance measuring method
JP2021128084A (en) Ranging device and ranging method
US20220353440A1 (en) Light reception device, method of controlling light reception device, and distance measuring device
WO2021124762A1 (en) Light receiving device, method for controlling light receiving device, and distance measuring device
WO2021111766A1 (en) Light-receiving device, method for controlling light-receiving device, and ranging device
WO2021161858A1 (en) Rangefinder and rangefinding method
WO2024095626A1 (en) Ranging device
WO2022044686A1 (en) Apd sensor and distance measurement system
WO2021053958A1 (en) Light reception device, distance measurement device, and distance measurement device control method
WO2020153182A1 (en) Light detection device, method for driving light detection device, and ranging device
WO2024095625A1 (en) Rangefinder and rangefinding method
WO2023223928A1 (en) Distance measurement device and distance measurement system
WO2023162734A1 (en) Distance measurement device
WO2021161857A1 (en) Distance measurement device and distance measurement method
WO2023281825A1 (en) Light source device, distance measurement device, and distance measurement method
WO2023281824A1 (en) Light receiving device, distance measurment device, and light receiving device control method
WO2023162733A1 (en) Distance measuring device and distance measuring method
WO2022176532A1 (en) Light receiving device, ranging device, and signal processing method for light receiving device
WO2024057471A1 (en) Photoelectric conversion element, solid-state imaging element, and ranging system
WO2023190278A1 (en) Light detection device
JP7407734B2 (en) Photodetection device, control method for photodetection device, and distance measuring device