WO2022254839A1 - Photodetection device and ranging system - Google Patents
- Publication number
- WO2022254839A1 (PCT/JP2022/008765)
- Authority
- WO
- WIPO (PCT)
Classifications
- G01S7/4863—Detector arrays, e.g. charge-transfer gates
- G01C3/085—Use of electric radiation detectors with electronic parallax measurement
- G01S7/4816—Constructional features, e.g. arrangements of optical elements, of receivers alone
- G01J1/0228—Control of working procedures; Failure detection; Spectral bandwidth calculation
- G01J1/42—Photometry, e.g. photographic exposure meter, using electric radiation detectors
- G01S17/10—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
- G01S17/931—Lidar systems specially adapted for anti-collision purposes of land vehicles
- G01S7/4865—Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
- G01S7/497—Means for monitoring or calibrating
- H04N25/683—Noise processing applied to defects by defect estimation performed on the scene signal, e.g. real time or on the fly detection
- H04N25/705—Pixels for depth measurement, e.g. RGBZ
- H04N25/79—Arrangements of circuitry being divided between different or multiple substrates, chips or circuit boards, e.g. stacked image sensors
- G01J2001/442—Single-photon detection or photon counting
- G01J2001/4466—Avalanche photodiode
Definitions
- The present disclosure relates to a photodetection device and a ranging system.
- A SPAD (Single Photon Avalanche Diode) can detect light with high accuracy: when a single photon is incident, the electrons (charges) generated by photoelectric conversion are multiplied in the PN junction region (avalanche amplification).
- The distance can therefore be measured with high accuracy by detecting the timing at which the current produced by the multiplied electrons flows.
- JP 2020-143996 A, JP 2020-112528 A, and JP 2020-112501 A
- In the range finder described above, if there is a failure in a control line that electrically connects the control unit controlling each light receiving element, the range finder is not properly controlled and the pixel signals from the light receiving elements may contain errors. Such a failure can in turn cause malfunction of any system (for example, an automated driving system) that uses the distance information produced by the distance measuring device. There is therefore a strong demand for a range finder that can detect such control-line failures.
- The present disclosure therefore proposes a photodetection device and a ranging system capable of detecting failures in control lines.
- According to the present disclosure, a photodetection device is provided that includes: a light receiving unit including a pixel array unit composed of a plurality of light receiving elements arranged in a matrix; a control signal generation unit that generates a control signal; a control unit electrically connected to the light receiving unit via a control line, the control unit controlling the light receiving unit based on the control signal; a processing unit electrically connected to the light receiving unit via a signal line, the processing unit processing an output signal from the light receiving unit; and a failure determination unit that detects a failure. The failure determination unit detects a failure of the control line based on the output signal from the light receiving unit when the light receiving unit is controlled by a failure detection control signal having a predetermined pattern.
- According to the present disclosure, a ranging system is also provided that includes an illumination device that emits irradiation light and a photodetection device that receives reflected light of the irradiation light reflected by an object. The photodetection device includes: a light receiving unit including a pixel array unit composed of a plurality of light receiving elements arranged in a matrix; a control signal generation unit that generates a control signal; a control unit electrically connected to the light receiving unit via a control line, the control unit controlling the light receiving unit based on the control signal; a processing unit electrically connected to the light receiving unit via a signal line, the processing unit processing an output signal from the light receiving unit; and a failure determination unit that detects a failure of the control line from the output signal from the light receiving unit when the light receiving unit is controlled by a failure detection control signal having a predetermined pattern.
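The failure-detection principle described above (drive the control lines with a known pattern, then compare the resulting output against the expected response) can be sketched in simplified form. The model below is illustrative only; the function names, the line count, and the stuck-at fault model are assumptions, not details of the disclosed circuit.

```python
# Illustrative sketch of pattern-based control-line fault detection
# (hypothetical model; not the actual circuitry of the disclosure).
# Each control line either passes the driven value through, or is
# "stuck" at 0/1 when it has failed.

def read_outputs(pattern, stuck_faults):
    """Simulate pixel outputs for a driven control pattern.

    pattern      -- list of 0/1 values driven on each control line
    stuck_faults -- dict {line_index: stuck_value} modelling failures
    """
    return [stuck_faults.get(i, v) for i, v in enumerate(pattern)]

def check_control_lines(stuck_faults):
    """Drive complementary test patterns and flag mismatching lines."""
    n = 8                                   # number of control lines (example)
    failed = set()
    for pattern in ([0] * n, [1] * n):      # all-0 then all-1 pattern
        observed = read_outputs(pattern, stuck_faults)
        for i, (exp, got) in enumerate(zip(pattern, observed)):
            if exp != got:
                failed.add(i)
    return sorted(failed)

# A line stuck at 1 is revealed by the all-0 pattern, and vice versa.
print(check_control_lines({3: 1}))   # -> [3]
print(check_control_lines({}))       # -> []
```

A real device would compare pixel output signals rather than the lines directly, but the principle of exercising both logic levels on every line is the same.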
- FIG. 4 is a diagram schematically illustrating ranging by a direct ToF method applicable to embodiments of the present disclosure
- FIG. 5 is a diagram showing an example histogram based on the time when a light receiving unit receives light, applicable to the embodiment of the present disclosure
- 1 is a block diagram showing an example of a configuration of a ranging system to which embodiments of the present disclosure can be applied
- FIG. 1 is a block diagram showing an example of a configuration of a distance measuring device to which embodiments of the present disclosure can be applied
- FIG. 1 is a circuit diagram showing an example of a configuration of a pixel circuit to which an embodiment of the present disclosure can be applied
- FIG. 4 is an explanatory diagram showing an example of connection between a plurality of pixel circuits to which the embodiment of the present disclosure can be applied;
- 1 is a schematic diagram showing an example of a layered structure of a distance measuring device to which an embodiment of the present disclosure can be applied;
- FIG. 1 is a block diagram showing a configuration of part of a distance measuring device according to an embodiment of the present disclosure;
- FIG. 1 is an explanatory diagram for explaining an outline of an embodiment of the present disclosure;
- FIG. 4 is a flow chart of a processing procedure according to an embodiment of the present disclosure;
- FIG. 4 is a flowchart for explaining an outline of a flow of SPAD access check according to an embodiment of the present disclosure;
- FIG. 4 is a table showing control signal patterns for fault checking of the I_SPAD line;
- FIG. 10 is an explanatory diagram for explaining details of control signal patterns for failure checking of the EN_VLINE line and the EN_AREA line;
- FIG. 10 is a table showing control signal patterns for fault checking of the ACT_SPAD_V line and the ACT_SPAD_H line;
- FIG. 10 is an explanatory diagram for explaining details of control signal patterns for failure check of the ACT_SPAD_V line and the ACT_SPAD_H line
- a table showing control signal patterns for failure check of a downsampling circuit
- an explanatory diagram for explaining details of a control signal pattern for failure check of the downsampling circuit
- a diagram showing an example of a timing chart when a failure is detected
- a block diagram showing a configuration example of a vehicle control system
- a diagram showing an example of a sensing area
- In the following description, "electrically connected" means a connection through which electricity (a signal) is conducted between a plurality of elements. It includes not only the case where a plurality of elements are directly and electrically connected, but also the case where they are indirectly and electrically connected through other elements.
- the present disclosure relates to technology for distance measurement using light.
- a direct ToF (Time of Flight) method is applied as a ranging method.
- In the direct ToF method, light emitted from a light source is reflected by the object to be measured, the reflected light is received by a light receiving element (specifically, a SPAD), and the distance is measured based on the time difference between the light emission timing and the light reception timing.
- FIG. 1 is a diagram schematically showing distance measurement by the direct ToF method applicable to the embodiment of the present disclosure
- FIG. 2 shows an example histogram based on the times at which the light receiving unit receives light.
- the distance measuring device 300 includes a light source section (illumination device) 301 and a light receiving section 302 .
- the light source unit 301 is, for example, a laser diode, and is driven to emit pulsed laser light. Light emitted from the light source unit 301 is reflected by the object to be measured (subject) 303 and received by the light receiving unit 302 as reflected light.
- the light receiving unit 302 includes a plurality of light receiving elements that convert light into electrical signals by photoelectric conversion, and can output pixel signals according to the received light.
- Let the time at which the light source unit 301 emits light (the light emission timing) be time t0, and the time at which the light receiving unit 302 receives the light emitted from the light source unit 301 and reflected by the object 303 (the light reception timing) be time t1.
- The constant c is the speed of light (2.9979 × 10^8 [m/s]).
- Then, the distance D between the range finder 300 and the object 303 to be measured is given by the following equation (1): D = (c / 2) × (t1 − t0) … (1)
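As a quick numerical illustration of equation (1), where D = (c / 2) × (t1 − t0), the following sketch computes a distance from a measured round-trip time (the function name is hypothetical):

```python
# Equation (1): D = (c / 2) * (t1 - t0), with c the speed of light.
C = 2.9979e8  # speed of light [m/s]

def tof_distance(t0, t1):
    """Distance D [m] from emission timing t0 and reception timing t1 [s]."""
    return (C / 2.0) * (t1 - t0)

# A round-trip time of 66.7 ns corresponds to roughly 10 m.
print(tof_distance(0.0, 66.7e-9))
```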
- Range finder 300 can repeat the above process multiple times.
- the light receiving unit 302 may include a plurality of light receiving elements, and the distance D may be calculated based on each light receiving timing when the reflected light is received by each light receiving element.
- The distance measuring device 300 classifies the time tm (referred to as the light reception time tm) from the light emission timing t0 to the light reception timing into classes (bins), and generates a histogram.
- the light received by the light receiving unit 302 at the light receiving time tm is not limited to the light emitted by the light source unit 301 and reflected by the object 303 to be measured.
- ambient light around the distance measuring device 300 (light receiving unit 302 ) is also received by the light receiving unit 302 .
- A bin is obtained by dividing the light reception time tm into intervals of a predetermined unit time d. Specifically, bin #0 is 0 ≤ tm < d, bin #1 is d ≤ tm < 2 × d, bin #2 is 2 × d ≤ tm < 3 × d, ..., and bin #(N − 2) is (N − 2) × d ≤ tm < (N − 1) × d.
- The range finder 300 counts the number of times the light reception time tm falls in each bin, obtains the frequency 310 for each bin, and generates a histogram.
- the light receiving unit 302 also receives light other than the reflected light that is the reflected light of the light emitted from the light source unit 301 .
- An example of such light other than the target reflected light is the ambient light described above.
- the portion indicated by the range 311 in the histogram contains ambient light components due to ambient light. Ambient light is light randomly incident on the light receiving unit 302 and becomes noise for the target reflected light.
- Reflected light of interest is light received according to a particular distance and appears as the active light component 312 in the histogram.
- a bin corresponding to the frequency of peaks in the active light component 312 is a bin corresponding to the distance D of the object 303 to be measured.
- The distance measuring device 300 acquires the representative time of that bin (for example, the time at the center of the bin) as the time t1 described above, and can calculate the distance D to the object 303 according to equation (1). By using a plurality of light reception results in this way, appropriate distance measurement is possible even in the presence of random noise.
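The histogram-based procedure described above (bin the reception times, take the peak bin's representative time as t1, then apply equation (1), assuming the emission timing t0 = 0) can be sketched as follows. All names and parameter values are illustrative assumptions, not values from the disclosure:

```python
import random

# Illustrative model of histogram-based direct ToF estimation.
C = 2.9979e8      # speed of light [m/s]
D_BIN = 1e-9      # bin width d [s] (example value)
N_BINS = 200      # number of bins N (example value)

def build_histogram(reception_times):
    """Count light reception times t_m into N_BINS bins of width D_BIN."""
    hist = [0] * N_BINS
    for tm in reception_times:
        b = int(tm // D_BIN)
        if 0 <= b < N_BINS:
            hist[b] += 1
    return hist

def estimate_distance(hist):
    """Take the peak bin's centre time as t1 and apply D = (c / 2) * t1."""
    peak = max(range(N_BINS), key=lambda b: hist[b])
    t1 = (peak + 0.5) * D_BIN   # representative (centre) time of the bin
    return (C / 2.0) * t1

# 50 returns clustered near 66.7 ns (about 10 m) plus 100 uniformly
# distributed ambient-light counts; the peak bin still wins.
random.seed(0)
times = [66.7e-9 + random.gauss(0, 0.2e-9) for _ in range(50)]
times += [random.uniform(0, N_BINS * D_BIN) for _ in range(100)]
print(round(estimate_distance(build_histogram(times)), 2))
```

This also illustrates why the histogram approach tolerates ambient light: random noise spreads thinly across many bins, while the target reflection concentrates in one.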
- FIG. 3 is a block diagram showing an example configuration of a ranging system 90 to which the embodiment of the present disclosure can be applied.
- the distance measurement system 90 can mainly include a light source section (illumination device) 400 , a distance measurement device 500 , a storage device 600 , a host 700 and an optical system 800 . Each block included in the distance measuring system 90 will be sequentially described below.
- the light source unit 400 corresponds to the light source unit 301 shown in FIG. 1 described above, is made up of a laser diode or the like, and is driven to emit, for example, pulsed laser light.
- As the light source unit 400, a VCSEL (Vertical Cavity Surface Emitting Laser), which emits laser light as a surface light source, can be applied.
- an array in which laser diodes are arranged in lines may be used as the light source unit 400, and a configuration in which laser light emitted from the laser diode array is scanned in a direction perpendicular to the lines may be used.
- a configuration in which a laser diode as a single light source is used as the light source unit 400 and the laser light emitted from the laser diode is scanned in the horizontal and vertical directions may be used.
- Distance measuring device 500 includes light receiving section 302 of FIG. 1 described above. Further, the light receiving section 302 has a pixel array section (not shown) composed of a plurality of light receiving elements arranged in a two-dimensional grid (matrix) (for example, 189 ⁇ 600). Details of the distance measuring device 500, the light receiving unit 302, and the like will be described later. Furthermore, the optical system 800 can guide external incident light to the pixel array section of the light receiving section 302 of the distance measuring device 500 .
- The distance measuring device 500 counts, within a predetermined time range, the number of acquisitions of time information (the light reception time tm) indicating the timing at which light is received by the pixel array section, obtains the frequency for each bin, and generates the above-described histogram. The distance measuring device 500 then calculates the distance D to the object 303 based on the generated histogram. Information indicating the calculated distance D is stored in, for example, the storage device 600.
- Host 700 can control the overall operation of ranging system 90 .
- the host 700 supplies a light emission trigger, which is a trigger for causing the light source unit 400 to emit light, to the distance measuring device 500 .
- the distance measuring device 500 causes the light source section 400 to emit light at the timing based on this light emission trigger, and stores the time t0 indicating the light emission timing.
- the host 700 may set a pattern for ranging to the ranging device 500 in response to an instruction from the outside, for example.
- FIG. 4 is a block diagram showing an example of a configuration of a distance measuring device 500 to which embodiments of the present disclosure can be applied.
- The distance measuring device 500 mainly includes a light receiving section 502 including a pixel array section 510, a processing section 530, a control section 570, a light emission timing control section 580, and an interface (I/F) 590. Each block included in the distance measuring device 500 will be described below.
- the light receiving section 502 includes a pixel array section 510 .
- the pixel array section 510 has a plurality of SPADs (light receiving elements) 512 arranged in a matrix (for example, arranged in 189 ⁇ 600).
- Each SPAD 512 is controlled by a control section 570 which will be described later.
- the control unit 570 can control readout of pixel signals from each SPAD 512 for each block including (p ⁇ q) SPADs 512, p in the row direction and q in the column direction.
- the control unit 570 can read out pixel signals from each SPAD 512 by scanning each SPAD 512 in the row direction and further in the column direction in units of blocks. Details of the light receiving unit 502 will be described later.
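The block-by-block scan order described above might be modeled as follows. This is an illustrative sketch: `block_scan_order` is a hypothetical helper, and the traversal order within and between blocks is an assumption.

```python
# Illustrative block-wise readout order for a pixel array of
# rows x cols SPADs, read in blocks of (p x q) pixels.
def block_scan_order(rows, cols, p, q):
    """Yield (row, col) positions scanned block by block: blocks advance
    in the row direction first, then in the column direction."""
    for block_r in range(0, rows, p):
        for block_c in range(0, cols, q):
            for r in range(block_r, min(block_r + p, rows)):
                for c in range(block_c, min(block_c + q, cols)):
                    yield (r, c)

order = list(block_scan_order(rows=4, cols=4, p=2, q=2))
print(order[:4])   # first block: [(0, 0), (0, 1), (1, 0), (1, 1)]
```

Every pixel is visited exactly once, but grouped so that each (p × q) block is read out before the scan moves on.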
- processing unit 530 can process pixel signals read from each SPAD 512 via signal lines. As shown in FIG. 4 , processing section 530 includes conversion section 540 , generation section 550 , and signal processing section 560 .
- pixel signals read from each SPAD 512 and output from the pixel array section 510 are supplied to the conversion section 540 .
- the conversion section 540 converts the pixel signals supplied from the pixel array section 510 into digital information. Specifically, the conversion section 540 converts the pixel signal supplied from the pixel array section 510 into time information indicating the timing at which the SPAD 512 corresponding to the pixel signal receives light.
- the generation unit 550 generates a histogram based on the time information when the pixel signal is converted by the conversion unit 540.
- the signal processing unit 560 performs predetermined arithmetic processing based on the histogram data generated by the generating unit 550, and calculates distance information, for example.
- the signal processing unit 560 creates an approximate curve of the histogram based on the histogram data generated by the generating unit 550, for example.
- the signal processing unit 560 can detect the peak of the curve approximated by this histogram, and obtain the distance D (an example of distance measurement information) based on the detected peak.
- the signal processing unit 560 may perform filter processing on the curve approximated by the histogram when performing the curve approximation of the histogram. For example, the signal processing unit 560 can suppress noise components by applying low-pass filter processing to the histogram-approximated curve.
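The smoothing-then-peak-detection step can be illustrated with a simple moving-average low-pass filter. The filter choice is an assumption; the disclosure does not specify which filter the signal processing unit 560 uses.

```python
# Minimal smoothing-then-peak-detection sketch (illustrative only).

def lowpass(hist, k=1):
    """Moving-average low-pass filter over histogram counts
    (window of 2*k + 1 bins, truncated at the edges)."""
    n = len(hist)
    out = []
    for i in range(n):
        lo, hi = max(0, i - k), min(n, i + k + 1)
        out.append(sum(hist[lo:hi]) / (hi - lo))
    return out

def peak_bin(hist):
    """Index of the maximum of the smoothed histogram."""
    smoothed = lowpass(hist)
    return max(range(len(smoothed)), key=lambda i: smoothed[i])

# An isolated noise spike (the 9) is attenuated by smoothing, while
# the broad peak around bins 6-8 survives.
hist = [1, 0, 2, 1, 9, 1, 14, 13, 12, 2, 1, 0]
print(peak_bin(hist))   # -> 7
```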
- the interface 590 functions as an output section that outputs the distance information supplied from the signal processing section 560 to the outside as output data.
- For example, a MIPI (Mobile Industry Processor Interface) can be applied as the interface 590.
- the distance information is output to the outside via the interface 590 in the above description, it is not limited to this example. That is, the histogram data generated by the generation unit 550 may be output from the interface 590 to the outside. In this case, the histogram data output from the interface 590 is supplied to, for example, an external information processing device and processed as appropriate.
- The control unit 570 can control the light receiving unit 502 and the like based on a control signal and a reference clock signal supplied from the outside, for example according to a preinstalled program. Furthermore, as described above, the control unit 570 can set a predetermined area of the pixel array unit 510 as a target area and read pixel signals from the SPADs 512 included in that target area. In addition, the control unit 570 can scan a plurality of rows (a plurality of lines) collectively, further scan them in the column direction, and read pixel signals from each SPAD 512.
- the light emission timing control section 580 generates a light emission control signal indicating light emission timing according to an externally supplied light emission trigger signal.
- the light emission control signal is supplied to the light source unit 400 and the processing unit 530 .
- the light receiving unit 502 described above is configured from a plurality of pixel circuits 900 each including a plurality of SPADs 512 . Therefore, an example of the configuration of a pixel circuit 900 to which the embodiments of the present disclosure can be applied will be described with reference to FIGS. 5 and 6.
- FIG. 5 is a circuit diagram showing an example of the configuration of a pixel circuit 900 to which the embodiment of the present disclosure can be applied.
- FIG. 6 is an explanatory diagram showing an example of connections between the plurality of pixel circuits 900.
- The pixel circuit 900 includes a SPAD (light receiving element) 512, transistors 902 and 904, a constant current source 910, switch circuits 920 and 940, an inverter circuit 930, and an OR circuit (logical sum circuit) 950.
- the SPAD 512 is a single photon avalanche diode that converts incident light into an electrical signal by photoelectric conversion and outputs the electrical signal.
- Specifically, the SPAD 512 converts incident photons into electrical signals by photoelectric conversion and outputs a pulse for each incident photon.
- the SPAD 512 has a characteristic that when a large negative voltage that causes avalanche multiplication is applied to the cathode, electrons generated in response to the incidence of one photon cause avalanche multiplication and a large current flows. By using such characteristics of the SPAD 512, it is possible to detect the incidence of one photon with high sensitivity.
- the cathode of SPAD 512 is connected to the junction of two transistors 902 and 904 and to the input of inverter circuit 930 . Also, the anode of SPAD 512 is electrically connected to a voltage source of voltage (-Vbd). Voltage (-Vbd) is a large negative voltage for generating avalanche multiplication for SPAD 512 .
- a transistor 902 is a P-channel MOSFET (Metal Oxide Semiconductor Field Effect Transistor), and a transistor 904 is an N-channel MOSFET, which are electrically connected to each other at their sources and drains. Gates of the transistors 902 and 904 are electrically connected to the switch circuit 920 . Further, the source of transistor 902 is electrically connected to power supply voltage Vdd through constant current source 910 .
- the switch circuit (driving switch) 920 is composed of a NAND circuit and functions as a switch for driving the SPAD 512 according to the control signal from the control section 570 described above. Specifically, switch circuit 920 controls transistors 902 and 904 to apply a reverse bias to SPAD 512 based on the control signals from the ACT_SPAD_V and ACT_SPAD_H lines (first and third control lines). Then, the SPAD 512 is activated by applying a reverse bias, and when photons are incident on the SPAD 512 in this state, avalanche multiplication starts, and current flows from the cathode to the anode of the SPAD 512 .
- the ACT_SPAD_V line and the ACT_SPAD_H line can each transmit the control signal of the pixel array section 510 to the switch circuit 920 of the pixel circuit 900 .
- the ON state/OFF state of the switch circuit 920 can be controlled for each pixel circuit 900 .
- the ON state of the switch circuit 920 can render the SPAD 512 active, while the OFF state of the switch circuit 920 can render the SPAD 512 inactive. Accordingly, power consumption of the SPADs 512 in the pixel array section 510 can be suppressed.
- the pixel signal from the SPAD 512 is input to the inverter circuit 930 .
- the inverter circuit 930 performs, for example, threshold determination on the input pixel signal, inverts the signal each time the pixel signal exceeds the threshold in the positive direction or the negative direction, and outputs a pulse signal.
- a pulse signal output from the inverter circuit 930 is input to a switch circuit (output switch) 940 .
- the switch circuit 940 is an AND circuit, and functions as a switch that controls the output of the pixel signal from the SPAD 512 according to the control signal from the control section 570 described above. Specifically, the switch circuit 940 outputs the pulse signal to the OR circuit 950 based on the control signals from the EN_VLINE line and the EN_AREA line (second and fourth control lines).
- the EN_VLINE line and the EN_AREA line can each transmit the control signal of the pixel array section 510 to the switch circuit 940 of the pixel circuit 900 .
- the ON state/OFF state of the switch circuit 940 can be controlled for each pixel circuit 900, and as a result, the output of each SPAD 512 can be controlled to be enabled/disabled.
- one input of the OR circuit 950 receives the output of the switch circuit 940, and the other input receives either the output of the switch circuit 940 of another pixel circuit 900 or a horizontal-direction control signal from the control unit 570 described above via the I_SPAD line (fifth control line). The OR circuit 950 produces an output according to these input signals.
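The gating performed by the drive switch 920, the output switch 940, and the OR circuit 950 described above can be summarized in a small behavioral sketch. This is an illustrative model only, simplified to active-high logic; the function and parameter names are assumptions for illustration and are not part of the disclosure.

```python
def pixel_output(photon_pulse: bool,
                 act_spad_v: bool, act_spad_h: bool,   # first/third control lines
                 en_vline: bool, en_area: bool,        # second/fourth control lines
                 neighbor_or_i_spad: bool) -> bool:
    """Return the OR-circuit 950 output for one pixel circuit 900 (sketch)."""
    # Switch 920: the SPAD 512 is driven only when both drive lines select it.
    spad_active = act_spad_v and act_spad_h
    # Inverter 930 produces a pulse only if the SPAD fired while active.
    pulse = photon_pulse and spad_active
    # Switch 940 (AND circuit): the pulse is output only when both enables are High.
    gated = pulse and en_vline and en_area
    # OR circuit 950 merges with the neighboring pixel's output or the I_SPAD line.
    return gated or neighbor_or_i_spad
```

In this sketch, a pixel whose enable lines are Low produces no output even when a photon arrives, which is the property the failure checks later exploit.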
- the outputs of adjacent SPADs 512 across a predetermined number of rows (for example, 20 rows) can be combined and output to the processing section 530 (more specifically, to a downsampling circuit (not shown)). For this purpose, a plurality of pixel circuits 900 can be connected as shown in FIG. 6, for example in the case of a four-SPAD cycle.
- the distance measuring device 500 to which the embodiments of the present disclosure can be applied can have a laminated structure in which a plurality of semiconductor substrates are laminated. Therefore, with reference to FIG. 7, an example of a laminated structure of a distance measuring device 500 to which an embodiment of the present disclosure can be applied will be described.
- FIG. 7 is a schematic diagram showing an example of a layered structure of a distance measuring device 500 to which an embodiment of the present disclosure can be applied.
- the distance measuring device 500 has a laminated structure in which two semiconductor substrates (first and second substrates) 200 and 250 are laminated. These semiconductor substrates 200 and 250 may be called semiconductor chips.
- the above-described pixel array section 510 is provided on the semiconductor substrate (first substrate) 200 shown on the upper side in FIG. 7, where the plurality of SPADs 512 are arranged in a matrix. Also, the transistors 902 and 904, the constant current source 910, the switch circuits 920 and 940, the inverter circuit 930, and the OR circuit 950 of the pixel circuit 900 are provided on the semiconductor substrate 250 shown on the lower side in FIG. 7.
- the SPAD 512 can be electrically connected to each element on the semiconductor substrate 250 by, for example, CCC (Copper-Copper Connection).
- a semiconductor substrate 250 is provided with a logic array section 252 including the transistors 902 and 904, the constant current source 910, the switch circuits 920 and 940, the inverter circuit 930, and the OR circuit 950 of the pixel circuit 900. Furthermore, the semiconductor substrate 250 can be provided with a processing unit 530 that processes the pixel signals acquired by the SPADs 512 and a control unit 570 that controls the operation of the distance measuring device 500 and the like, in proximity to the logic array section 252. Control lines such as the ACT_SPAD_V line, the ACT_SPAD_H line, the EN_VLINE line, the EN_AREA line, and the I_SPAD line are provided on the semiconductor substrate 250.
- the layout on the semiconductor substrates 200 and 250 is not limited to the configuration example shown in FIG. 7, and various layouts can be selected. Furthermore, the distance measuring device 500 is not limited to being formed by laminating a plurality of semiconductor substrates, and may be formed by a single semiconductor substrate.
- FIG. 8 is a block diagram showing a configuration of part of the distance measuring device 500 according to the embodiment of the present disclosure
- FIG. 9 is an explanatory diagram for explaining the outline of the embodiment of the present disclosure.
- the control unit 570 can control the pixel circuit 900 through a plurality of control lines (specifically, the ACT_SPAD_V line, the ACT_SPAD_H line, the EN_VLINE line, the EN_AREA line, the I_SPAD line, etc.).
- if one of these control lines fails, the pixel signal from the pixel circuit 900 may contain an error due to improper control, and a system using the distance measurement information from the distance measuring device 500 may malfunction. Therefore, the distance measuring device 500 is strongly required to detect such control line failures.
- faults in control lines can be detected.
- a fault detection control signal having a predetermined pattern is generated in order to detect a fault in the control line.
- the controller 570 controls the pixel circuit 900 based on the control signal.
- the failure determination unit 534 provided in the processing unit 530 acquires the pixel signal output from the pixel circuit 900 controlled based on the failure detection control signal; if the acquired pixel signal differs from its expected output value, the failure determination unit 534 determines that there is a failure in the control line and notifies the host 700 or the like.
- according to the present disclosure created by the present inventors, it is possible to detect a failure in a control line. Furthermore, according to the present embodiment, the above failure can be detected by the distance measuring device 500 alone, without receiving an instruction from the host 700, while the distance measuring device 500 is in operation. In addition, in the present embodiment, it is not necessary to irradiate the light receiving unit 502 (pixel array unit 510) with light, nor to obtain a reference signal for the determination in advance, so failure detection can be carried out easily. The details of each embodiment of the present disclosure will be described in order below.
- distance measuring device 500 mainly includes pixel circuit 900 , control section 570 , and processing section 530 .
- the control section 570 controls the processing section 530.
- each block included in the range finder 500 will be described in order, but the description of the parts overlapping with the range finder 500 described so far will be omitted.
- the pixel circuit 900 includes a SPAD (light receiving element) 512 , switch circuits 920 and 940 and an OR circuit (logical sum circuit) 950 .
- a switch circuit (drive switch) 920 drives the SPAD 512 by control signals from the vertical control section 572 and the horizontal control section 574 of the control section 570 via the ACT_SPAD_V line and the ACT_SPAD_H line (first and third control lines).
- the switch circuit (output switch) 940 switches the SPAD 512 according to control signals from the vertical control unit 572 and the horizontal control unit 574 of the control unit 570 via the EN_VLINE line and the EN_AREA line (second and fourth control lines). It functions as a switch that controls the output of pixel signals from.
- an OR circuit (logical sum circuit) 950 can output the pixel signals of the SPAD 512 to the downsampling circuit 532 of the processing section 530 according to the control signal from the vertical control section 572 of the control section 570 via the I_SPAD line (fifth control line).
- the downsampling circuit 532 may include a column shift circuit (not shown) that removes noise.
- the controller 570 includes a vertical controller (row-direction readout controller) 572 , a horizontal controller (column-direction readout controller) 574 , and a control signal generator 576 .
- the vertical control unit 572 can control the pixel circuits 900 (specifically, the operation and output of the SPAD 512) in the vertical direction, that is, in units of rows.
- the vertical control section 572 can also control the OR circuit 950 described above.
- the horizontal control section 574 can control the pixel circuits 900 (more specifically, the operation and output of the SPAD 512) in the horizontal direction, that is, in units of columns.
- the control signal generator 576 can generate a control signal and output it to the vertical controller 572 and the horizontal controller 574 described above. Specifically, the control signal generation unit 576 can generate a failure detection control signal having a predetermined pattern, for example, in order to detect failures in the various control lines, and output it to the above-described vertical control unit 572 and horizontal control unit 574.
- the processing section 530 mainly includes a downsampling circuit 532 and a failure determination section 534 .
- the down-sampling circuit 532 converts into a digital signal the pixel signal that is read from the pixel circuit 900 through the signal line and subjected to signal processing such as noise removal in a column shift circuit (not shown). Furthermore, the down-sampling circuit 532 outputs the pixel signal converted into a digital signal to the failure determination section 534, which will be described later.
- the failure determination unit 534 acquires a pixel signal (output signal) from the pixel circuit 900 controlled by the failure detection control signal and determines whether the acquired pixel signal differs from an expected output value. If it does, the failure determination unit 534 determines that the control line is faulty and notifies the host 700 or the like.
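The comparison performed by the failure determination unit 534 reduces to matching the read-back signals against the values expected for the applied pattern. The following is a hypothetical sketch of that judgment; the function names and the notification message are assumptions, not part of the disclosure.

```python
def judge_failure(observed: list, expected: list) -> bool:
    """Return True when the read-back pixel signals differ from the
    values expected under the failure-detection control pattern."""
    return observed != expected

def notify_host(failed: bool, send) -> None:
    # Per the text, the host 700 (or a memory, etc.) is informed only
    # when a mismatch, i.e. a control-line failure, was found.
    if failed:
        send("control-line failure")
```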
- the present embodiment is not limited to the configuration of FIG. 8, which shows only the essential parts of the configuration of the distance measuring device 500; other elements and the like may be added.
- FIG. 10 is a flow chart of processing procedures according to this embodiment.
- the processing procedure according to this embodiment can mainly include a plurality of steps from step S101 to step S107. Details of each of these steps according to the present embodiment will be described below.
- the processing for failure detection according to the present embodiment is performed while the ranging device 500 is in operation and the frame synchronization signal is inactive. Furthermore, the process is repeatedly performed at a suitable frequency while the distance measuring device 500 is in operation.
- the ranging device 500 determines the content of failure detection (step S101). For example, in the present embodiment, in addition to the detection of control line failures (SPAD access check), failures of the processing unit 530 (data path check) and failures of the pixel circuit 900 (light receiving unit) (SPAD check) may be detected. In this embodiment, if the processing unit 530 breaks down, it may become impossible to detect a fault in the control lines or the pixel circuit 900, so it is preferable to first perform the data path check. That is, it is preferable to perform the SPAD access check and the SPAD check in this order following the data path check.
- the order of the SPAD access check and the SPAD check is not limited; the order may be reversed, or only one of the failure detections may be performed. In this embodiment, whether to execute each failure detection can be selected by an enable register corresponding to that failure detection.
- the distance measuring device 500 generates a failure detection control signal having a predetermined pattern for performing failure detection in a predetermined order based on the determination in step S101 described above (step S102). For example, when the data path check, the SPAD access check, and the SPAD check are performed in this order, the failure detection control signal contains the control signal for the data path check, the control signal for the SPAD access check, and the control signal for the SPAD check in that order.
- the distance measuring device 500 performs control for failure detection using the failure detection control signal generated in step S102 (step S103).
- the distance measuring device 500 acquires pixel signals from the SPAD 512 under the control in step S103 described above (step S104).
- the ranging device 500 determines whether or not a failure has been detected (step S105). Specifically, the distance measuring device 500 determines that a failure has been detected, for example, when the pixel signal acquired in step S104 described above differs from the expected output value. If the ranging device 500 determines that a failure has been detected (step S105: Yes), it proceeds to the processing of step S106; otherwise (step S105: No), it proceeds to the processing of step S107.
- the range finder 500 notifies the host 700 and the like of the result indicating that a failure has been detected (step S106). For example, when a failure is detected, a Low level signal may be output from a predetermined output terminal (error output terminal) of the distance measuring device 500. Alternatively, when a failure is detected, an ON signal (High level signal) may be written to an area indicating an error status in the status signal, indicating the state of the ranging device 500, that is transmitted from the ranging device 500 to the host 700, or information indicating an error may be written to the signal (MIPI data) output from the interface 590. Such information may not only be notified to the host 700 but also be stored in a memory or the like (not shown).
- the ranging device 500 may notify detailed information such as information as to whether or not failure is detected in any of the data path check, SPAD access check, and SPAD check.
- the distance measuring device 500 outputs the processing result of the pixel signal acquired in step S104 described above, and ends the series of processing (step S107). It should be noted that, in the present embodiment, as described above, a series of processes are repeatedly performed at a suitable frequency while the distance measuring device 500 is operating.
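The repeated flow of steps S101 to S106 can be sketched as a small loop. This is an illustrative sketch only; the check names, the per-check callables, and the notification callback are assumed helpers, and the fixed order reflects the preference stated above that the data path check runs first.

```python
def failure_detection_cycle(checks, enabled, notify):
    """checks: name -> callable returning True when no failure is found.
    enabled: name -> bool (models the per-check enable registers).
    notify: callable invoked with the name of each failing check."""
    for name in ("data_path", "spad_access", "spad"):   # S101: fixed order
        if not enabled.get(name, False):                # check disabled by register
            continue
        # S102-S105: generate the pattern, apply control, read back, and judge.
        if not checks[name]():
            notify(name)                                # S106: report the failure
```

In a real device this loop would be re-run at a suitable frequency while the frame synchronization signal is inactive, as the text describes.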
- FIG. 11 is a flowchart for explaining an overview of the SPAD access check flow according to this embodiment.
- the SPAD access check primarily detects control line failures, as described above.
- a check is performed to detect failure of the I_SPAD line (fifth control line) connecting the OR circuit (logical sum circuit) 950 and the vertical control section 572 and the like.
- a check is performed to detect failures in the EN_VLINE and EN_AREA lines (second and fourth control lines) connecting the switch circuit 940 and the vertical control section 572 and the horizontal control section 574 of the control section 570 .
- the ACT_SPAD_V line and the ACT_SPAD_H line (first and third control lines) connecting the switch circuit 920 and the vertical control section 572 and the horizontal control section 574 of the control section 570 are checked for failure detection.
- checks are made to detect failures in the downsampling circuit 532 .
- the order is not limited to that shown in FIG. 11, and the order may be changed.
- the check may be stopped at the stage where a failure is detected, which is preferable; by doing so, it is possible to recognize the failure point while shortening the check time. Details of each stage of the SPAD access check will be described below.
- FIG. 12 is a table showing a control signal pattern (first signal) for the failure check of the I_SPAD line, and FIG. 13 is an explanatory diagram for explaining details of the control signal pattern for the failure check of the I_SPAD line.
- in test #1 of FIG. 12, all target regions (all pixel circuits 900 including the I_SPAD line) are short-circuited with the power supply voltage (fixed at High level). Specifically, the control signals from the vertical control section 572 and the horizontal control section 574 to the switch circuits 920 and 940 through the ACT_SPAD_V line, ACT_SPAD_H line, EN_VLINE line, and EN_AREA line are set to Low level. Further, the control signal from the vertical control section 572 to the OR circuit 950 through the I_SPAD line is set to Low level. At this time, if a pixel signal corresponding to the short circuit with the power supply voltage is output, it can be confirmed that the electrical connection of the I_SPAD line is secured, that is, there is no failure.
- patterns A and B are used as control signals from the vertical control section 572 to the OR circuit 950 via the I_SPAD line.
- in pattern A, as shown in the upper part of FIG. 13, among a plurality of I_SPAD lines located in the same hierarchy, two I_SPAD lines in the OFF state are arranged between I_SPAD lines in the ON state, and the pattern is prepared so as not to match the states of adjacent I_SPAD lines located in different hierarchies.
- pattern B, as shown in the lower part of FIG. 13, is pattern A shifted horizontally by one. Detectable locations in each pattern are the locations indicated by solid lines in FIG. 13.
- the control signals from the vertical control unit 572 and horizontal control unit 574 to the switch circuits 920 and 940 through the ACT_SPAD_V line, ACT_SPAD_H line, EN_VLINE line, and EN_AREA line are set to Low level. .
- if a pixel signal corresponding to the electrical connections of patterns A and B shown in FIG. 13 is output, it can be confirmed that the electrical connection of the I_SPAD line is secured, that is, there is no failure.
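The walking test pattern described above can be sketched as follows, under the assumption that "one ON line followed by two OFF lines" repeats with a period of three; the function names and the period are illustrative assumptions.

```python
def pattern_a(n: int) -> list:
    """Pattern A (sketch): ON, OFF, OFF repeating across n I_SPAD lines."""
    return [i % 3 == 0 for i in range(n)]

def pattern_b(n: int) -> list:
    """Pattern B (sketch): pattern A shifted horizontally by one position."""
    a = pattern_a(n)
    return a[-1:] + a[:-1]
```

Driving both patterns and comparing the read-back pixel signals against them covers locations that a single pattern would miss, which is why the check uses two shifted patterns.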
- FIG. 14 is a table showing control signal patterns (second signals) for the failure check of the EN_VLINE and EN_AREA lines, and FIG. 15 is an explanatory diagram for explaining details of the control signal patterns for the failure check of the EN_VLINE and EN_AREA lines.
- the EN_VLINE line connected to the switch circuits 940 of all the target regions (all pixel circuits 900) is shorted to the power supply voltage (fixed to High level), and the power supply voltage and the EN_VLINE line are electrically connected.
- the control signals from the vertical control section 572 and the horizontal control section 574 to the switch circuits 920 of all target regions (all pixel circuits 900) through the ACT_SPAD_V line and the ACT_SPAD_H line are set to Low level.
- the control signal from the horizontal control unit 574 to the switch circuits 940 of all target areas (all pixel circuits 900) through the EN_AREA line is set to Low level, and the control signal from the vertical control section 572 to the switch circuits 940 of all target areas (all pixel circuits 900) through the EN_VLINE line is set to High level.
- the EN_AREA lines connected to the switch circuits 940 of all target regions (all pixel circuits 900) are shorted to the power supply voltage (fixed at a high level), and the power supply voltage and the EN_AREA lines are electrically connected.
- the control signal from the vertical control unit 572 to the switch circuits 940 of all target regions (all pixel circuits 900) through the EN_VLINE line is set to Low level, and the control signal from the horizontal control unit 574 to the switch circuits 940 of all target regions (all pixel circuits 900) through the EN_AREA line is set to High level.
- if a pixel signal corresponding to the short circuit with the power supply voltage is output, it can be confirmed that the electrical connection of the EN_AREA line is secured, that is, there is no failure.
- each target area of every 21 rows (the pixel circuits 900 for 21 rows) is shorted to GND (fixed at Low level).
- the control signals from the vertical control section 572 and the horizontal control section 574 to the switch circuits 920 of all target regions (all pixel circuits 900) through the ACT_SPAD_V line and the ACT_SPAD_H line are set to Low level.
- the control signal from the vertical control section 572 to the switch circuits 940 of each target region (the pixel circuits 900 for a predetermined row) through the EN_VLINE line is set to High level, and the control signal from the horizontal control unit 574 to the switch circuits 940 of all target regions (all pixel circuits 900) via the EN_AREA line is set to High level.
- if a pixel signal corresponding to this control is output, it can be confirmed that the electrical connection of the EN_VLINE line and the EN_AREA line in the target area (the pixel circuits 900 for a predetermined row) is secured, that is, there is no failure.
- in test #12 in FIG. 14, the electrical connection in the column direction of the target area from row 0 to row 20 (the pixel circuits 900 from row 0 to row 20) is confirmed. Specifically, the control signals from the vertical control section 572 and the horizontal control section 574 to the switch circuits 920 of all target regions (all pixel circuits 900) through the ACT_SPAD_V line and the ACT_SPAD_H line are set to Low level.
- the control signal from the vertical control unit 572 to the switch circuits 940 of the target area from the 0th row to the 20th row (the pixel circuits 900 from the 0th row to the 20th row) through the EN_VLINE line is set to High level.
- FIG. 16 is a table showing control signal patterns (third signals) for the failure check of the ACT_SPAD_V line and ACT_SPAD_H line, and FIG. 17 is an explanatory diagram for explaining details of the control signal patterns for the failure check of the ACT_SPAD_V line and ACT_SPAD_H line.
- the electrical connections of the ACT_SPAD_V line and the ACT_SPAD_H line to the power supply voltage and GND are checked; here, if even one connection line failure is detected, it is determined to be a failure.
- detection is performed in a state where all target regions (all pixel circuits 900) are short-circuited with GND (fixed at Low level), and then, as shown on the right side of FIG. 17, detection is performed in a state of being short-circuited with the power supply voltage in the row direction and the column direction (fixed at High level) for each target region of every 21 rows. Note that, for example, in the High level fixation, a failure can be detected when a Low level signal is output.
- in test #1 of FIG. 16, as shown on the left side of FIG. 17, all target regions (all pixel circuits 900) are shorted to GND (fixed at Low level).
- specifically, the control signals from the vertical control unit 572 and the horizontal control unit 574 through the ACT_SPAD_V line, ACT_SPAD_H line, EN_VLINE line, and EN_AREA line are set to High level. At this time, when a High level signal is output, failures in the ACT_SPAD_H line and the ACT_SPAD_V line can be detected.
- the electrical connection with the power supply voltage in the row direction of the target area (pixel circuits 900 for 21 rows) every 21 rows is confirmed (fixed at high level).
- the control signal from the vertical control unit 572 to the switch circuits 920 of all target regions (all pixel circuits 900) through the ACT_SPAD_V line is set to Low level, and the control signal from the horizontal control unit 574 to the switch circuits 920 of all target regions (all pixel circuits 900) via the ACT_SPAD_H line is set to High level.
- the control signal from the vertical control unit 572 to the switch circuits 940 of the target region (pixel circuits 900 for 21 rows) for every 21 rows via the EN_VLINE line is set to High level, and the control signal from the horizontal control section 574 to the switch circuits 940 of all target regions (all pixel circuits 900) through the EN_AREA line is set to High level.
- by this control, a failure of the ACT_SPAD_V line in the target area can be detected.
- the electrical connection with the power supply voltage in the column direction of the target area (pixel circuits 900 for 21 rows) for every 21 rows is confirmed (High level fixation).
- the control signal from the vertical control section 572 to the switch circuits 920 of all target regions (all pixel circuits 900) through the ACT_SPAD_V line is set to High level, and the control signal from the horizontal control unit 574 to the switch circuits 920 of all target regions (all pixel circuits 900) via the ACT_SPAD_H line is set to Low level.
- the control signal from the vertical control unit 572 to the switch circuits 940 of the target region (pixel circuits 900 for 21 rows) for every 21 rows via the EN_VLINE line is set to High level, and the control signal from the horizontal control section 574 to the switch circuits 940 of all target regions (all pixel circuits 900) through the EN_AREA line is set to High level.
- by this control, a failure of the ACT_SPAD_H line in the target area can be detected.
- FIG. 18 is a table showing control signal patterns (fourth signals) for the failure check of the downsampling circuit 532, and an accompanying figure is an explanatory diagram for explaining details of those control signal patterns.
- failures of the downsampling circuits 532 corresponding to the sequentially selected target regions are sequentially detected.
- the presence or absence of failures in the column direction of target regions of rows 0 to 20 is sequentially checked.
- specifically, the control signals from the vertical control section 572 and the horizontal control section 574 to the switch circuits 920 of all target regions (all pixel circuits 900) through the ACT_SPAD_V and ACT_SPAD_H lines are set to Low level.
- the control signal from the vertical control section 572 to the switch circuit 940 of the target area of rows 0 to 20 (pixel circuits 900 of rows 0 to 20) through the EN_VLINE line is set to High level.
- the control signal from the horizontal control section 574 to the switch circuit 940 via the EN_AREA line is set to High level for each of the plurality of columns.
- the control signal is sequentially set to High level while shifting the columns horizontally. At this time, if a pixel signal corresponding to the control signal pattern shown in FIG. 18 is output, it means that the down-sampling circuit 532 has no failure.
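The column sweep described above can be sketched as follows. This is a hypothetical illustration of driving a group of columns High via the EN_AREA line and shifting it horizontally; the function name, the grouping, and the mask representation are assumptions, not part of the disclosure.

```python
def column_sweep(num_cols: int, group: int):
    """Yield one EN_AREA enable mask per step (sketch): a group of
    columns is driven High and walked horizontally across the array."""
    for start in range(0, num_cols, group):
        yield [start <= c < start + group for c in range(num_cols)]
```

Comparing the pixel signals read back at each step against the corresponding mask confirms that the downsampling circuit 532 passes each column group correctly.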
- a signal indicating that a failure has been detected is output only when a failure is detected after performing a series of detections.
- for example, a Low level signal may be output from a predetermined output terminal (error output terminal) of the distance measuring device 500, or an ON signal (High level signal) may be written to an area indicating an error status.
- failures in control lines can be detected.
- the above failure can be detected by the distance measuring device 500 alone without receiving an instruction from the host 700 while the distance measuring device 500 is in operation.
- FIG. 21 is a block diagram showing a configuration example of a vehicle control system 11, which is an example of a mobile device control system to which the present technology is applied.
- the vehicle control system 11 is provided in the vehicle 1 and performs processing related to driving support and automatic driving of the vehicle 1.
- the vehicle control system 11 includes a vehicle control ECU (Electronic Control Unit) 21, a communication unit 22, a map information accumulation unit 23, a position information acquisition unit 24, an external recognition sensor 25, an in-vehicle sensor 26, a vehicle sensor 27, a storage unit 28, a driving support/automatic driving control unit 29, a DMS (Driver Monitoring System) 30, an HMI (Human Machine Interface) 31, and a vehicle control unit 32.
- Vehicle control ECU 21, communication unit 22, map information storage unit 23, position information acquisition unit 24, external recognition sensor 25, in-vehicle sensor 26, vehicle sensor 27, storage unit 28, driving support/automatic driving control unit 29, driver monitoring system ( DMS) 30 , human machine interface (HMI) 31 , and vehicle control unit 32 are connected via a communication network 41 so as to be able to communicate with each other.
- the communication network 41 is composed of, for example, an in-vehicle communication network, a bus, or the like conforming to digital two-way communication standards such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), FlexRay (registered trademark), and Ethernet (registered trademark).
- the communication network 41 may be used properly depending on the type of data to be transmitted.
- CAN may be applied to data related to vehicle control
- Ethernet may be applied to large-capacity data.
- each unit of the vehicle control system 11 may be directly connected without the communication network 41, using wireless communication that assumes relatively short-distance communication, such as near field communication (NFC (Near Field Communication)) or Bluetooth (registered trademark).
- the vehicle control ECU 21 is composed of various processors such as a CPU (Central Processing Unit) and an MPU (Micro Processing Unit).
- the vehicle control ECU 21 can control the functions of the entire vehicle control system 11 or a part of the functions.
- the communication unit 22 can communicate with various devices inside and outside the vehicle, other vehicles, servers, base stations, etc., and transmit and receive various data. At this time, the communication unit 22 may perform communication using a plurality of communication methods.
- the communication unit 22 can communicate with a server located on an external network (hereinafter referred to as an external server) via a base station or an access point, using a wireless communication method such as 5G (5th generation mobile communication system), LTE (Long Term Evolution), or DSRC (Dedicated Short Range Communications).
- the external network with which the communication unit 22 communicates is, for example, the Internet, a cloud network, or a provider's own network.
- the communication method that the communication unit 22 uses with the external network is not particularly limited as long as it is a wireless communication method that enables digital two-way communication at a communication speed equal to or higher than a predetermined value and over a distance equal to or longer than a predetermined value.
- the communication unit 22 can communicate with a terminal existing in the vicinity of the own vehicle using P2P (Peer To Peer) technology.
- Terminals in the vicinity of one's own vehicle are, for example, terminals worn by moving objects that move at relatively low speeds, such as pedestrians and bicycles, terminals installed at fixed locations in stores and the like, or MTC (Machine Type Communication) terminals.
- the communication unit 22 can also perform V2X communication.
- V2X communication includes, for example, communication between the vehicle and others, such as vehicle-to-vehicle communication with other vehicles, vehicle-to-infrastructure communication with roadside equipment and the like, vehicle-to-home communication, and vehicle-to-pedestrian communication with a terminal or the like carried by a pedestrian.
- the communication unit 22 can receive from the outside, over the air, a program for updating the software that controls the operation of the vehicle control system 11. Furthermore, the communication unit 22 can receive map information, traffic information, information around the vehicle 1, and the like from the outside. Further, for example, the communication unit 22 can transmit information about the vehicle 1, information about the surroundings of the vehicle 1, and the like to the outside. The information about the vehicle 1 that the communication unit 22 transmits to the outside includes, for example, data indicating the state of the vehicle 1, recognition results by the recognition unit 73, and the like. Furthermore, for example, the communication unit 22 can also perform communication corresponding to a vehicle emergency call system such as e-call.
- the communication unit 22 can also receive electromagnetic waves transmitted by a vehicle information and communication system (VICS (registered trademark)) such as radio beacons, optical beacons, and FM multiplex broadcasting.
- the communication unit 22 can communicate with each device in the vehicle using, for example, wireless communication.
- the communication unit 22 can perform wireless communication with devices in the vehicle using a communication method, such as wireless LAN, Bluetooth, NFC, or WUSB (Wireless USB), that enables digital two-way communication at a communication speed equal to or higher than a predetermined value.
- the communication unit 22 can also communicate with each device in the vehicle using wired communication.
- the communication unit 22 can communicate with each device in the vehicle by wired communication via a cable connected to a connection terminal (not shown).
- the communication unit 22 can communicate with each device in the vehicle by wired communication, such as USB (Universal Serial Bus), HDMI (High-Definition Multimedia Interface) (registered trademark), or MHL (Mobile High-Definition Link), that enables digital two-way communication at a communication speed equal to or higher than a predetermined value.
- equipment in the vehicle refers to equipment that is not connected to the communication network 41 in the vehicle, for example.
- in-vehicle devices include mobile devices and wearable devices possessed by passengers such as drivers, information devices that are brought into the vehicle and temporarily installed, and the like.
- the map information accumulation unit 23 can accumulate one or both of a map obtained from the outside and a map created by the vehicle 1. For example, the map information accumulation unit 23 accumulates a three-dimensional high-precision map, a global map that is lower in accuracy than the high-precision map but covers a wide area, and the like.
- High-precision maps are, for example, dynamic maps, point cloud maps, vector maps, etc.
- the dynamic map is, for example, a map consisting of four layers of dynamic information, quasi-dynamic information, quasi-static information, and static information, and is provided to the vehicle 1 from an external server or the like.
- a point cloud map is a map composed of a point cloud (point cloud data).
- a vector map is a map adapted to ADAS (Advanced Driver Assistance System) and AD (Autonomous Driving) by associating traffic information such as lane and traffic signal positions with a point cloud map.
- the point cloud map and the vector map may be provided from an external server or the like, or may be created by the vehicle 1, based on the sensing results of the camera 51, the radar 52, the LiDAR 53, etc., as a map for matching with a local map described later, and stored in the map information accumulation unit 23. Further, when a high-precision map is provided from an external server or the like, map data of, for example, several hundred meters square regarding the planned route that the vehicle 1 will travel is acquired from the external server or the like in order to reduce the communication capacity.
- the location information acquisition unit 24 can receive GNSS signals from GNSS (Global Navigation Satellite System) satellites and acquire location information of the vehicle 1 .
- the acquired position information is supplied to the driving support/automatic driving control unit 29 .
- the location information acquisition unit 24 is not limited to the method using GNSS signals, and may acquire location information using beacons, for example.
- the external recognition sensor 25 has various sensors used to recognize the situation outside the vehicle 1 and can supply sensor data from each sensor to each part of the vehicle control system 11 .
- the types and number of sensors included in the external recognition sensor 25 are not particularly limited.
- the external recognition sensor 25 has a camera 51 , a radar 52 , a LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) 53 , and an ultrasonic sensor 54 .
- the configuration is not limited to this, and the external recognition sensor 25 may have one or more sensors among the camera 51 , radar 52 , LiDAR 53 , and ultrasonic sensor 54 .
- the numbers of cameras 51 , radars 52 , LiDARs 53 , and ultrasonic sensors 54 are not particularly limited as long as they are realistically installable in the vehicle 1 .
- the type of sensor provided in the external recognition sensor 25 is not limited to this example, and the external recognition sensor 25 may have other types of sensors. An example of the sensing area of each sensor included in the external recognition sensor 25 will be described later.
- the imaging method of the camera 51 is not particularly limited.
- cameras of various shooting methods such as a ToF (Time of Flight) camera, a stereo camera, a monocular camera, and an infrared camera, which are shooting methods capable of distance measurement, can be applied to the camera 51 as necessary.
- the camera 51 is not limited to this, and may simply acquire a photographed image regardless of distance measurement.
- the external recognition sensor 25 can have an environment sensor for detecting the environment for the vehicle 1 .
- the environment sensor is a sensor for detecting the environment such as weather, climate, brightness, etc., and can include various sensors such as raindrop sensors, fog sensors, sunshine sensors, snow sensors, and illuminance sensors.
- the external recognition sensor 25 has a microphone used for detecting sounds around the vehicle 1 and the position of the sound source.
- the in-vehicle sensor 26 has various sensors for detecting information inside the vehicle, and can supply sensor data from each sensor to each part of the vehicle control system 11 .
- the types and number of various sensors included in the in-vehicle sensor 26 are not particularly limited as long as they are the types and number that can be realistically installed in the vehicle 1 .
- the in-vehicle sensor 26 can have one or more sensors among cameras, radar, seating sensors, steering wheel sensors, microphones, and biosensors.
- as the camera provided in the in-vehicle sensor 26, for example, cameras of various shooting methods capable of distance measurement, such as a ToF camera, a stereo camera, a monocular camera, and an infrared camera, can be used.
- the camera included in the in-vehicle sensor 26 is not limited to this, and may simply acquire a photographed image regardless of distance measurement.
- the biosensors included in the in-vehicle sensor 26 are provided, for example, on a seat, a steering wheel, or the like, and detect various biometric information of a passenger such as a driver.
- the vehicle sensor 27 has various sensors for detecting the state of the vehicle 1 and can supply sensor data from each sensor to each part of the vehicle control system 11 .
- the types and number of various sensors included in the vehicle sensor 27 are not particularly limited as long as the types and number are practically installable in the vehicle 1 .
- the vehicle sensor 27 can have a speed sensor, an acceleration sensor, an angular velocity sensor (gyro sensor), and an inertial measurement unit (IMU (Inertial Measurement Unit)) integrating them.
- the vehicle sensor 27 has a steering angle sensor that detects the steering angle of the steering wheel, a yaw rate sensor, an accelerator sensor that detects the amount of operation of the accelerator pedal, and a brake sensor that detects the amount of operation of the brake pedal.
- the vehicle sensor 27 has a rotation sensor that detects the number of rotations of an engine or a motor, an air pressure sensor that detects tire air pressure, a slip rate sensor that detects a tire slip rate, and a wheel speed sensor that detects the rotational speed of a wheel.
- the vehicle sensor 27 has a battery sensor that detects the remaining battery level and temperature, and an impact sensor that detects external impact.
- the storage unit 28 includes at least one of a nonvolatile storage medium and a volatile storage medium, and can store data and programs.
- the storage unit 28 uses, for example, an EEPROM (Electrically Erasable Programmable Read Only Memory) and a RAM (Random Access Memory), and as a storage medium, a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device can be applied.
- the storage unit 28 stores various programs and data used by each unit of the vehicle control system 11 .
- the storage unit 28 has an EDR (Event Data Recorder) and a DSSAD (Data Storage System for Automated Driving), and stores information of the vehicle 1 before and after an event such as an accident and information acquired by the in-vehicle sensor 26.
- the driving support/automatic driving control unit 29 can control driving support and automatic driving of the vehicle 1 .
- the driving support/automatic driving control unit 29 has an analysis unit 61 , an action planning unit 62 , and an operation control unit 63 .
- the analysis unit 61 can analyze the vehicle 1 and its surroundings.
- the analysis unit 61 has a self-position estimation unit 71 , a sensor fusion unit 72 and a recognition unit 73 .
- the self-position estimation unit 71 can estimate the self-position of the vehicle 1 based on the sensor data from the external recognition sensor 25 and the high-precision map accumulated in the map information accumulation unit 23. For example, the self-position estimation unit 71 generates a local map based on sensor data from the external recognition sensor 25, and estimates the self-position of the vehicle 1 by matching the local map and the high-precision map.
- the position of the vehicle 1 is based on, for example, the center of the rear wheel axle.
- a local map is, for example, a three-dimensional high-precision map created using techniques such as SLAM (Simultaneous Localization and Mapping), an occupancy grid map, or the like.
- the three-dimensional high-precision map is, for example, the point cloud map described above.
- the occupancy grid map is a map that divides the three-dimensional or two-dimensional space around the vehicle 1 into grids (lattice) of a predetermined size and shows the occupancy state of objects in grid units.
- the occupancy state of an object is indicated, for example, by the presence or absence of the object and the existence probability.
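As a minimal illustration of the occupancy grid idea described above (not part of the disclosed embodiments), each cell can hold an existence probability that is raised whenever a sensed point falls inside it. The cell size, grid dimensions, and update increment below are assumptions chosen purely for illustration:

```python
# Minimal 2D occupancy-grid sketch: each cell holds an existence
# probability for an object. Cell size and update increment are
# illustrative assumptions, not values from the disclosure.

CELL_SIZE = 0.5   # meters per grid cell (assumed)
GRID_DIM = 8      # grid is GRID_DIM x GRID_DIM cells (assumed)

def make_grid():
    """Create an empty grid of occupancy probabilities (all 0.0)."""
    return [[0.0] * GRID_DIM for _ in range(GRID_DIM)]

def update(grid, points, increment=0.3):
    """Raise the occupancy probability of each cell hit by a sensed point."""
    for x, y in points:
        col = int(x / CELL_SIZE)
        row = int(y / CELL_SIZE)
        if 0 <= row < GRID_DIM and 0 <= col < GRID_DIM:
            grid[row][col] = min(1.0, grid[row][col] + increment)
    return grid

grid = update(make_grid(), [(1.2, 0.6), (1.3, 0.7), (3.9, 3.9)])
print(grid[1][2])  # probability of the cell hit by the two nearby points
```

Representing occupancy as a probability rather than a binary flag lets repeated detections of the same cell strengthen confidence, which matches the "presence or absence and existence probability" description above.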
- the local map is also used, for example, by the recognizing unit 73 for detection processing and recognition processing of the situation outside the vehicle 1 .
- the self-position estimation unit 71 may estimate the self-position of the vehicle 1 based on the position information acquired by the position information acquisition unit 24 and the sensor data from the vehicle sensor 27.
- the sensor fusion unit 72 can perform sensor fusion processing that combines a plurality of different types of sensor data (for example, image data supplied from the camera 51 and sensor data supplied from the radar 52) to obtain new information. Methods for combining different types of sensor data may include integration, fusion, federation, and the like.
- the recognition unit 73 can execute a detection process for detecting the situation outside the vehicle 1 and a recognition process for recognizing the situation outside the vehicle 1 .
- the recognition unit 73 performs detection processing and recognition processing of the situation outside the vehicle 1 based on information from the external recognition sensor 25, information from the self-position estimation unit 71, information from the sensor fusion unit 72, and the like. .
- the recognition unit 73 performs detection processing and recognition processing of objects around the vehicle 1 .
- Object detection processing is, for example, processing for detecting the presence or absence, size, shape, position, movement, and the like of an object.
- Object recognition processing is, for example, processing for recognizing an attribute such as the type of an object or identifying a specific object.
- the detection process and the recognition process are not always clearly separated, and may overlap.
- the recognition unit 73 detects objects around the vehicle 1 by performing clustering that classifies a point cloud based on sensor data from the radar 52, the LiDAR 53, or the like into clusters of point groups. As a result, the presence or absence, size, shape, and position of objects around the vehicle 1 are detected.
- the recognizing unit 73 detects the movement of objects around the vehicle 1 by performing tracking that follows the movement of the cluster of points classified by clustering. As a result, the speed and traveling direction (movement vector) of the object around the vehicle 1 are detected.
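As a rough sketch of the clustering step only (the description does not specify the algorithm used; the distance threshold and the naive single-linkage rule below are assumptions for illustration), points that lie close together can be grouped into one cluster, and each cluster summarized by a bounding box giving the detected object's position and size:

```python
# Naive single-linkage clustering of 2D points: a point joins every
# existing cluster it lies within `threshold` of, merging them.
# The threshold value is an illustrative assumption.
import math

def cluster_points(points, threshold=1.0):
    """Group points so that points within `threshold` of each other
    (directly or through a chain of points) share a cluster."""
    clusters = []
    for p in points:
        near, far = [], []
        for c in clusters:
            (near if any(math.dist(p, q) <= threshold for q in c) else far).append(c)
        merged = [p]
        for c in near:
            merged.extend(c)
        clusters = far + [merged]
    return clusters

def bounding_box(cluster):
    """Approximate an object's position/size by the cluster's bounding box."""
    xs = [p[0] for p in cluster]
    ys = [p[1] for p in cluster]
    return (min(xs), min(ys), max(xs), max(ys))

pts = [(0.0, 0.0), (0.5, 0.2), (5.0, 5.0), (5.3, 5.1)]
clusters = cluster_points(pts)
print(len(clusters))  # two well-separated groups of points -> 2
```

Tracking would then associate each bounding box with one found in the previous frame to estimate a movement vector, as described above.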
- the recognition unit 73 detects or recognizes vehicles, people, bicycles, obstacles, structures, roads, traffic lights, traffic signs, road markings, etc. based on image data supplied from the camera 51 . Further, the recognition unit 73 may recognize types of objects around the vehicle 1 by performing recognition processing such as semantic segmentation.
- the recognition unit 73 can perform recognition processing of traffic rules around the vehicle 1 based on the map accumulated in the map information accumulation unit 23, the estimation result of the self-position by the self-position estimation unit 71, and the recognition result of objects around the vehicle 1 by the recognition unit 73. Through this processing, the recognition unit 73 can recognize the position and state of traffic lights, the content of traffic signs and road markings, the content of traffic restrictions, the lanes in which the vehicle can travel, and the like.
- the recognition unit 73 can perform recognition processing of the environment around the vehicle 1 .
- the surrounding environment to be recognized by the recognition unit 73 includes the weather, temperature, humidity, brightness, road surface conditions, and the like.
- the action plan section 62 creates an action plan for the vehicle 1.
- the action planning unit 62 can create an action plan by performing route planning and route following processing.
- route planning (global path planning) is the process of planning a rough route from the start to the goal. This route planning also includes trajectory generation (local path planning), which generates, on the planned route, a trajectory along which the vehicle 1 can proceed safely and smoothly in its vicinity in consideration of the motion characteristics of the vehicle 1.
- Route following is the process of planning actions to safely and accurately travel the route planned by route planning within the planned time.
- the action planning unit 62 can, for example, calculate the target speed and the target angular speed of the vehicle 1 based on the result of this route following processing.
- the motion control unit 63 can control the motion of the vehicle 1 in order to implement the action plan created by the action planning unit 62.
- the operation control unit 63 controls the steering control unit 81, the brake control unit 82, and the drive control unit 83 included in the vehicle control unit 32, which will be described later, and performs acceleration/deceleration control and direction control so that the vehicle 1 travels along the trajectory calculated by the trajectory generation.
- the operation control unit 63 performs cooperative control aimed at realizing ADAS functions such as collision avoidance or shock mitigation, follow-up driving, vehicle speed maintenance driving, collision warning of own vehicle, and lane deviation warning of own vehicle.
- the operation control unit 63 performs cooperative control aimed at automatic driving in which the vehicle autonomously travels without depending on the driver's operation.
- the DMS 30 can perform driver authentication processing, driver state recognition processing, etc., based on sensor data from the in-vehicle sensor 26 and input data input to the HMI 31, which will be described later.
- the state of the driver to be recognized includes, for example, physical condition, wakefulness, concentration, fatigue, gaze direction, degree of drunkenness, driving operation, posture, and the like.
- the DMS 30 may perform authentication processing for passengers other than the driver and processing for recognizing the state of such passengers. Further, for example, the DMS 30 may perform recognition processing of the situation inside the vehicle based on sensor data from the in-vehicle sensor 26. Conditions inside the vehicle to be recognized include, for example, temperature, humidity, brightness, smell, and the like.
- the HMI 31 can input various data, instructions, etc., and present various data to the driver.
- the HMI 31 has an input device for human input of data.
- the HMI 31 generates an input signal based on data, instructions, etc. input from an input device, and supplies the input signal to each section of the vehicle control system 11 .
- the HMI 31 has operating elements such as a touch panel, buttons, switches, and levers as input devices.
- the HMI 31 is not limited to this, and may further have an input device capable of inputting information by a method other than manual operation using voice, gestures, or the like.
- the HMI 31 may use, as an input device, a remote control device using infrared rays or radio waves, or an external connection device such as a mobile device or wearable device corresponding to the operation of the vehicle control system 11 .
- the presentation of data by the HMI 31 will be briefly explained.
- the HMI 31 generates visual information, auditory information, and tactile information for the passenger or outside the vehicle.
- the HMI 31 performs output control for controlling the output, output content, output timing, output method, and the like of each generated information.
- the HMI 31 generates and outputs visual information such as an operation screen, a status display of the vehicle 1, a warning display, an image such as a monitor image showing the situation around the vehicle 1, and information indicated by light.
- the HMI 31 also generates and outputs information indicated by sounds such as voice guidance, warning sounds, warning messages, etc., as auditory information.
- the HMI 31 generates and outputs, as tactile information, information given to the passenger's tactile sense by force, vibration, movement, or the like.
- as an output device for the HMI 31 to output visual information, a display device that presents visual information by displaying an image by itself, or a projector device that presents visual information by projecting an image, can be applied.
- the display device may be a device that displays visual information within the passenger's field of view, such as a head-up display, a transmissive display, or a wearable device with an AR (Augmented Reality) function.
- the HMI 31 can also use a display device provided in the vehicle 1, such as a navigation device, an instrument panel, a CMS (Camera Monitoring System), an electronic mirror, a lamp, etc., as an output device for outputting visual information.
- Audio speakers, headphones, and earphones can be applied as output devices for the HMI 31 to output auditory information.
- a haptic element using haptic technology can be applied as an output device for the HMI 31 to output tactile information.
- a haptic element is provided at a portion of the vehicle 1 that is in contact with a passenger, such as a steering wheel or a seat.
- the vehicle control unit 32 can control each unit of the vehicle 1.
- the vehicle control unit 32 has a steering control unit 81 , a brake control unit 82 , a drive control unit 83 , a body system control unit 84 , a light control unit 85 and a horn control unit 86 .
- the steering control unit 81 can detect and control the state of the steering system of the vehicle 1 .
- the steering system has, for example, a steering mechanism including a steering wheel, an electric power steering, and the like.
- the steering control unit 81 has, for example, a steering ECU that controls the steering system, an actuator that drives the steering system, and the like.
- the brake control unit 82 can detect and control the state of the brake system of the vehicle 1 .
- the brake system has, for example, a brake mechanism including a brake pedal, an ABS (Antilock Brake System), a regenerative brake mechanism, and the like.
- the brake control unit 82 has, for example, a brake ECU that controls the brake system, an actuator that drives the brake system, and the like.
- the drive control unit 83 can detect and control the state of the drive system of the vehicle 1 .
- the drive system includes, for example, an accelerator pedal, a driving force generator for generating driving force such as an internal combustion engine or a driving motor, and a driving force transmission mechanism for transmitting the driving force to the wheels.
- the drive control unit 83 has, for example, a drive ECU that controls the drive system, an actuator that drives the drive system, and the like.
- the body system control unit 84 can detect and control the state of the body system of the vehicle 1 .
- the body system includes, for example, a keyless entry system, smart key system, power window device, power seat, air conditioner, air bag, seat belt, shift lever, and the like.
- the body system control unit 84 has, for example, a body system ECU that controls the body system, an actuator that drives the body system, and the like.
- the light control unit 85 can detect and control the states of various lights of the vehicle 1 .
- Lights to be controlled include, for example, headlights, backlights, fog lights, turn signals, brake lights, projections, bumper displays, and the like.
- the light control unit 85 includes a light ECU for controlling lights, an actuator for driving lights, and the like.
- the horn control unit 86 can detect and control the state of the car horn of the vehicle 1 .
- the horn control unit 86 has, for example, a horn ECU for controlling the car horn, an actuator for driving the car horn, and the like.
- FIG. 22 is a diagram showing an example of sensing areas by the camera 51, radar 52, LiDAR 53, ultrasonic sensor 54, etc. of the external recognition sensor 25. FIG. 22 schematically shows the vehicle 1 viewed from above; the left end side is the front end (front) side of the vehicle 1, and the right end side is the rear end (rear) side of the vehicle 1.
- a sensing area 101F and a sensing area 101B are examples of sensing areas of the ultrasonic sensor 54.
- the sensing area 101F covers the periphery of the front end of the vehicle 1 with a plurality of ultrasonic sensors 54.
- the sensing area 101B covers the periphery of the rear end of the vehicle 1 with a plurality of ultrasonic sensors 54 .
- the sensing results in the sensing area 101F and the sensing area 101B are used, for example, for parking assistance of the vehicle 1 and the like.
- Sensing areas 102F to 102B show examples of sensing areas of the radar 52 for short or medium range.
- the sensing area 102F covers the front of the vehicle 1 to a position farther than the sensing area 101F.
- the sensing area 102B covers the rear of the vehicle 1 to a position farther than the sensing area 101B.
- the sensing area 102L covers the rear periphery of the left side surface of the vehicle 1 .
- the sensing area 102R covers the rear periphery of the right side surface of the vehicle 1 .
- the sensing result in the sensing area 102F is used, for example, to detect vehicles, pedestrians, etc. existing in front of the vehicle 1.
- the sensing result in the sensing area 102B is used for the rear collision prevention function of the vehicle 1, for example.
- the sensing results in the sensing area 102L and the sensing area 102R are used, for example, to detect an object in a blind spot on the side of the vehicle 1, or the like.
- Sensing areas 103F to 103B show examples of sensing areas by the camera 51 .
- the sensing area 103F covers the front of the vehicle 1 to a position farther than the sensing area 102F.
- the sensing area 103B covers the rear of the vehicle 1 to a position farther than the sensing area 102B.
- the sensing area 103L covers the periphery of the left side surface of the vehicle 1 .
- the sensing area 103R covers the periphery of the right side surface of the vehicle 1 .
- the sensing results in the sensing area 103F can be used, for example, for recognition of traffic lights and traffic signs, lane departure prevention support systems, and automatic headlight control systems.
- a sensing result in the sensing area 103B can be used for parking assistance and a surround view system, for example.
- Sensing results in the sensing area 103L and the sensing area 103R can be used, for example, in a surround view system.
- the sensing area 104 shows an example of the sensing area of the LiDAR 53.
- the sensing area 104 covers the front of the vehicle 1 to a position farther than the sensing area 103F.
- the sensing area 104 has a narrower lateral range than the sensing area 103F.
- the sensing results in the sensing area 104 are used, for example, to detect objects such as surrounding vehicles.
- a sensing area 105 shows an example of a sensing area of the long-range radar 52 .
- the sensing area 105 covers the front of the vehicle 1 to a position farther than the sensing area 104 .
- the sensing area 105 has a narrower lateral range than the sensing area 104 .
- the sensing results in the sensing area 105 are used, for example, for ACC (Adaptive Cruise Control), emergency braking, and collision avoidance.
- the sensing regions of the camera 51, the radar 52, the LiDAR 53, and the ultrasonic sensor 54 included in the external recognition sensor 25 may have various configurations other than those shown in FIG. 22. Specifically, the ultrasonic sensor 54 may also sense the sides of the vehicle 1, and the LiDAR 53 may sense the rear of the vehicle 1. Moreover, the installation position of each sensor is not limited to the examples mentioned above. Also, the number of each sensor may be one or more.
- the technology of the present disclosure can be applied to, for example, the LiDAR 53.
- for example, by applying the technology of the present disclosure to the LiDAR 53 of the vehicle control system 11, it becomes possible to easily detect a failure occurring in the LiDAR 53, thereby preventing malfunction and erroneous detection of the LiDAR 53. As a result, the LiDAR 53 can normally detect surrounding vehicles and the like, and the safety of the vehicle 1 can be ensured.
- a photodetection device comprising: a light-receiving section including a pixel array section composed of a plurality of light-receiving elements arranged in a matrix; a control signal generator that generates a control signal; a control unit electrically connected to the light-receiving section via a control line and controlling the light-receiving section based on the control signal; a processing unit electrically connected to the light-receiving section via a signal line and processing an output signal from the light-receiving section; and a failure determination unit that detects a failure, wherein the failure determination unit detects a failure of the control line from the output signal from the light-receiving section controlled based on a failure detection control signal having a predetermined pattern.
- the light receiving unit includes: a drive switch provided for each light-receiving element for driving the light-receiving element; an output switch provided for each light-receiving element for controlling output of an output signal from the light-receiving element; and a logical sum circuit that outputs according to the output signal from each light-receiving element and the output signal from the other light-receiving elements.
- the control line includes: a first control line electrically connecting the vertical control unit and the drive switch; a second control line electrically connecting the vertical control unit and the output switch; a third control line electrically connecting the horizontal control unit and the drive switch; and a fourth control line electrically connecting the horizontal control unit and the output switch.
- the control signal generator outputs the failure detection control signal while the frame synchronization signal is inactive.
- the failure detection control signal includes a control line failure detection signal for detecting a failure in the control line following a processing unit failure detection signal for detecting a failure in the processing unit.
- the failure detection control signal includes a light receiving section failure detection signal for detecting a failure of the light receiving section before or after the control line failure detection signal.
- the control line failure detection signal includes a first signal for detecting a failure of the fifth control line.
- the control line failure detection signal includes a second signal for detecting a failure of the second and fourth control lines.
- the control line failure detection signal includes a third signal for detecting a failure of the first and third control lines.
- the control line failure detection signal includes a fourth signal for detecting a failure of the downsampling circuit.
- the control line failure detection signal includes the signals in the order of the first signal, the second signal, the third signal, and the fourth signal.
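The ordering constraints in the items above can be sketched as a simple sequence (an illustration only; all signal names below are hypothetical labels, not identifiers from the disclosure): a processing-unit check first, the four control-line checks in fixed order, and a light-receiving-section check before or after the control-line block.

```python
# Illustrative ordering of the failure detection control signal as
# described above. All names are hypothetical labels for illustration.

CONTROL_LINE_ORDER = [
    "first_signal",   # checks the fifth control line
    "second_signal",  # checks the second and fourth control lines
    "third_signal",   # checks the first and third control lines
    "fourth_signal",  # checks the downsampling circuit
]

def build_sequence(light_receiving_first=False):
    """Return the full failure detection sequence as a list of labels.

    The light-receiving-section check may come before or after the
    control-line checks, matching the "before or after" wording above.
    """
    seq = ["processing_unit_check"]
    if light_receiving_first:
        seq.append("light_receiving_check")
    seq += CONTROL_LINE_ORDER
    if not light_receiving_first:
        seq.append("light_receiving_check")
    return seq

seq = build_sequence()
print(seq[0], seq[-1])  # processing-unit check first in this variant
```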
- a ranging system including: a lighting device that emits irradiation light; and a photodetector that receives reflected light of the irradiation light reflected by a subject, wherein the photodetector has: a light-receiving section including a pixel array section composed of a plurality of light-receiving elements arranged in a matrix; a control signal generator that generates a control signal; a control unit electrically connected to the light-receiving section via a control line and controlling the light-receiving section based on the control signal; a processing unit electrically connected to the light-receiving section via a signal line and processing an output signal from the light-receiving section; and a failure determination unit that detects a failure, and the failure determination unit detects a failure of the control line from the output signal from the light-receiving section controlled based on a failure detection control signal having a predetermined pattern.
Description
1. Background leading the inventor to the creation of the embodiments of the present disclosure
1.1 Distance measurement method
1.2 Distance measurement system
1.3 Distance measurement device
1.4 Pixel circuit
1.5 Stacked structure
1.6 Background
2. Embodiments
2.1 Distance measurement device
2.2 Processing procedure
3. Summary
4. Application examples
5. Supplement
<1.1 Distance Measurement Method>
First, before describing the embodiments of the present disclosure, the background that led the inventor to create them will be explained, beginning with an overview of the distance measurement methods to which each embodiment of the present disclosure can be applied. The present disclosure relates to a technique for measuring distance using light. In the embodiments of the present disclosure, the direct ToF (Time of Flight) method is applied as the distance measurement method. In the direct ToF method, light emitted from a light source and reflected by an object under measurement is received by a light receiving element (specifically, a SPAD), and the distance is measured based on the time difference between the light emission timing and the light reception timing.
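As a minimal illustrative sketch of the direct ToF principle described here (not part of the patent; the function name and values are ours), the distance follows from the emission-to-reception time difference, halved because the light makes a round trip:

```python
# Illustrative sketch: direct ToF converts the emission/reception time
# difference into distance, halving for the round trip to the object.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(t0_ns: float, t1_ns: float) -> float:
    """Distance in meters from emission time t0 and reception time t1 (both in ns)."""
    dt_s = (t1_ns - t0_ns) * 1e-9
    return SPEED_OF_LIGHT * dt_s / 2.0

# A round trip of about 66.7 ns corresponds to an object roughly 10 m away.
print(round(tof_distance(0.0, 66.7), 2))
```

In practice the device histograms many such timestamps per pixel rather than using a single sample, but the per-event arithmetic is as above.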
Next, an example configuration of a distance measurement system 90 to which the embodiments of the present disclosure can be applied will be described with reference to FIG. 3. FIG. 3 is a block diagram showing an example configuration of the distance measurement system 90. As shown in FIG. 3, the distance measurement system 90 can mainly include a light source unit (lighting device) 400, a distance measurement device 500, a storage device 600, a host 700, and an optical system 800. Each block of the distance measurement system 90 is described in turn below.
The light source unit 400 corresponds to the light source unit 301 in FIG. 1 described above, consists of a laser diode or the like, and is driven to emit, for example, pulsed laser light. A VCSEL (Vertical Cavity Surface Emitting Laser) that emits laser light as a surface light source can be applied to the light source unit 400. Alternatively, the light source unit 400 may use an array of laser diodes arranged in a line, scanning the laser light emitted from the array in the direction perpendicular to the line. As a further alternative, the light source unit 400 may use a single laser diode as the light source, scanning its laser light in the horizontal and vertical directions.
The distance measurement device 500 includes the light receiving unit 302 in FIG. 1 described above. The light receiving unit 302 has a pixel array unit (not shown) composed of a plurality of light receiving elements arranged in a two-dimensional lattice (matrix) (for example, 189 × 600 elements). Details of the distance measurement device 500, the light receiving unit 302, and so on will be described later. The optical system 800 can guide light incident from the outside to the pixel array unit of the light receiving unit 302 of the distance measurement device 500.
The host 700 can control the overall operation of the distance measurement system 90. For example, the host 700 supplies the distance measurement device 500 with a light emission trigger for causing the light source unit 400 to emit light. The distance measurement device 500 causes the light source unit 400 to emit light at a timing based on this trigger and stores the time t0 indicating the emission timing. The host 700 may also configure the pattern used for distance measurement in the distance measurement device 500, for example in response to an external instruction.
Next, an example configuration of the distance measurement device 500 to which the embodiments of the present disclosure can be applied will be described with reference to FIG. 4. FIG. 4 is a block diagram showing an example configuration of the distance measurement device 500. As shown in FIG. 4, the distance measurement device 500 mainly includes a light receiving unit 502 containing a pixel array unit 510, a processing unit 530, a control unit 570, a light emission timing control unit 580, and an interface (I/F) 590. Each block of the distance measurement device 500 is described in turn below.
As shown in FIG. 4, the light receiving unit 502 includes the pixel array unit 510, which has a plurality of SPADs (light receiving elements) 512 arranged in a matrix (for example, 189 × 600). Each SPAD 512 is controlled by the control unit 570 described later. For example, the control unit 570 can control the readout of pixel signals from the SPADs 512 in units of blocks, each containing (p × q) SPADs 512: p in the row direction and q in the column direction. Using these blocks as the unit, the control unit 570 can scan the SPADs 512 in the row direction and then in the column direction to read out the pixel signals from each SPAD 512. Details of the light receiving unit 502 will be described later.
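As a hypothetical sketch of the block-by-block readout just described (the array size, block orientation, and function names are illustrative assumptions, not from the patent), the scan can be expressed as nested iteration over (p × q) blocks, row direction first, then column direction:

```python
# Illustrative sketch of block-wise readout: the pixel array is visited in
# blocks of p pixels along a row and q pixels along a column, sweeping the
# blocks in the row direction first and then in the column direction.
def scan_blocks(rows: int, cols: int, p: int, q: int):
    """Yield each readout block as a list of (row, col) pixel coordinates."""
    for block_row in range(0, rows, q):        # column-direction sweep of blocks
        for block_col in range(0, cols, p):    # row-direction sweep of blocks
            yield [(r, c)
                   for r in range(block_row, min(block_row + q, rows))
                   for c in range(block_col, min(block_col + p, cols))]

# A toy 4x6 array with 3x2 blocks is covered by 4 blocks of 6 pixels each.
blocks = list(scan_blocks(rows=4, cols=6, p=3, q=2))
print(len(blocks))
```

The real device would read all (p × q) SPADs of a block in parallel rather than pixel by pixel; the iteration order is the point of the sketch.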
The processing unit 530 can process the pixel signals read out from each SPAD 512 via the signal lines. As shown in FIG. 4, the processing unit 530 includes a conversion unit 540, a generation unit 550, and a signal processing unit 560.
The control unit 570 can control the light receiving unit 502 and the like based on a control signal and an externally supplied reference clock signal, for example in accordance with a preinstalled program. As described above, the control unit 570 can designate a predetermined region of the pixel array unit 510 as the target region and control the SPADs 512 in that region as the targets of pixel signal readout. The control unit 570 can also scan multiple rows (multiple lines) together, and then scan further in the column direction, to read out the pixel signals from each SPAD 512.
The light emission timing control unit 580 generates a light emission control signal indicating the emission timing in accordance with an externally supplied light emission trigger signal. The light emission control signal is supplied to the light source unit 400 and also to the processing unit 530.
More specifically, the light receiving unit 502 described above is composed of a plurality of pixel circuits 900, each including one of the SPADs 512. An example configuration of the pixel circuit 900 to which the embodiments of the present disclosure can be applied is described with reference to FIGS. 5 and 6. FIG. 5 is a circuit diagram showing an example configuration of the pixel circuit 900, and FIG. 6 is an explanatory diagram showing an example of the connections between multiple pixel circuits 900.
The distance measurement device 500 to which the embodiments of the present disclosure can be applied may have a stacked structure in which multiple semiconductor substrates are laminated. An example of this stacked structure is described with reference to FIG. 7. FIG. 7 is a schematic diagram showing an example of the stacked structure of the distance measurement device 500.
Next, building on the configuration example of the distance measurement device 500 described above, the background that led the inventor to create the embodiments of the present disclosure is described in detail with reference to FIGS. 8 and 9. FIG. 8 is a block diagram showing part of the configuration of the distance measurement device 500 according to an embodiment of the present disclosure, and FIG. 9 is an explanatory diagram outlining the embodiments of the present disclosure.
<2.1 Distance Measurement Device>
First, the main parts of the configuration of the distance measurement device 500 according to an embodiment of the present disclosure are described with reference to FIG. 8. As shown in FIG. 8, the distance measurement device 500 mainly includes the pixel circuit 900, the control unit 570, and the processing unit 530. Each block of the distance measurement device 500 is described in turn below; descriptions overlapping those already given for the distance measurement device 500 are omitted.
As shown in FIG. 8, the pixel circuit 900 includes the SPAD (light receiving element) 512, switch circuits 920 and 940, and an OR circuit (logical sum circuit) 950. The switch circuit (drive switch) 920 functions as a switch that drives the SPAD 512 in response to control signals from the vertical control unit 572 and the horizontal control unit 574 of the control unit 570 via the ACT_SPAD_V and ACT_SPAD_H lines (first and third control lines). The switch circuit (output switch) 940 functions as a switch that controls the output of the pixel signal from the SPAD 512 in response to control signals from the vertical control unit 572 and the horizontal control unit 574 via the EN_VLINE and EN_AREA lines (second and fourth control lines). Further, the OR circuit 950 can output the pixel signal of the SPAD 512 to the downsampling circuit 532 of the processing unit 530 in response to a control signal from the vertical control unit 572 via the I_SPAD line (fifth control line). The downsampling circuit 532 may include a column shift circuit (not shown) that removes noise.
As shown in FIG. 8, the control unit 570 includes the vertical control unit (row-direction readout control unit) 572, the horizontal control unit (column-direction readout control unit) 574, and the control signal generation unit 576. The vertical control unit 572 can control the pixel circuits 900 (specifically, the operation and output of the SPADs 512) in the vertical direction, that is, row by row; it can also control the OR circuit 950 described above. The horizontal control unit 574 can control the pixel circuits 900 (specifically, the operation and output of the SPADs 512) in the horizontal direction, that is, column by column.
As shown in FIG. 8, the processing unit 530 mainly includes the downsampling circuit 532 and the failure determination unit 534. The downsampling circuit 532 converts into a digital signal the pixel signal that has been read out from the pixel circuit 900 via the signal line and subjected to signal processing such as noise removal in a column shift circuit (not shown). The downsampling circuit 532 then outputs the converted pixel signal to the failure determination unit 534 described below.
(Overview of the Processing Procedure)
Next, an overview of the processing procedure of this embodiment is described with reference to FIG. 10. FIG. 10 is a flowchart of the processing procedure according to this embodiment. As shown in FIG. 10, the procedure can mainly include a plurality of steps, from step S101 to step S107, each of which is detailed below.
Next, the SPAD access check according to this embodiment, which detects failures in the control lines, is described in detail with reference to FIG. 11. FIG. 11 is a flowchart outlining the flow of the SPAD access check according to this embodiment. As explained earlier, the SPAD access check mainly detects failures in the control lines.
The failure check of the I_SPAD line is described in detail with reference to FIGS. 12 and 13. FIG. 12 is a table showing the control signal pattern (first signal) for the failure check of the I_SPAD line, and FIG. 13 is an explanatory diagram detailing that control signal pattern.
The failure check of the EN_VLINE and EN_AREA lines is described in detail with reference to FIGS. 14 and 15. FIG. 14 is a table showing the control signal pattern (second signal) for the failure check of the EN_VLINE and EN_AREA lines, and FIG. 15 is an explanatory diagram detailing that control signal pattern.
The failure check of the ACT_SPAD_V and ACT_SPAD_H lines is described in detail with reference to FIGS. 16 and 17. FIG. 16 is a table showing the control signal pattern (third signal) for the failure check of the ACT_SPAD_V and ACT_SPAD_H lines, and FIG. 17 is an explanatory diagram detailing that control signal pattern.
The failure check of the downsampling circuit 532 is described with reference to FIGS. 18 and 19. FIG. 18 is a table showing the control signal pattern (fourth signal) for the failure check of the downsampling circuit 532, and FIG. 19 is an explanatory diagram detailing that control signal pattern.
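Purely as an illustrative sketch (the actual pattern contents and expected outputs live in the figures and are not reproduced here; the names below are ours), the SPAD access check applies the four control signal patterns in a fixed order and records which target each mismatching pattern implicates:

```python
# Hypothetical sketch of the SPAD access check sequence: the first through
# fourth control signal patterns are applied in order, and a mismatch
# between the observed output and its expectation marks the target faulty.
CHECKS = [
    ("first signal", "I_SPAD line"),
    ("second signal", "EN_VLINE / EN_AREA lines"),
    ("third signal", "ACT_SPAD_V / ACT_SPAD_H lines"),
    ("fourth signal", "downsampling circuit"),
]

def spad_access_check(pattern_passes):
    """Run the four checks in order; return the targets that mismatched.

    `pattern_passes(name)` is a stand-in for driving the named pattern onto
    the control lines and comparing the read-back output to its expectation.
    """
    failures = []
    for name, target in CHECKS:
        if not pattern_passes(name):
            failures.append(target)
    return failures

# With every pattern matching its expectation, no faults are reported.
print(spad_access_check(lambda name: True))
```

This mirrors the order first signal → second signal → third signal → fourth signal stated in the claims; only when the completed sequence has found a mismatch is a failure signaled.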
In this embodiment, as explained earlier, the series of detections is carried out first, and a signal indicating that a failure was detected is output only when a failure is actually found. For example, as shown in region R100 in the lower part of FIG. 20, which shows an example timing chart at failure detection, a Low-level signal may be output from a predetermined output terminal (error output terminal) of the distance measurement device 500 when a failure is detected.
As described above, according to the embodiments of the present disclosure, failures in the control lines can be detected. Specifically, according to this embodiment, the distance measurement device 500 can detect such failures on its own during operation, without receiving instructions from the host 700. Furthermore, in this embodiment, there is no need to irradiate the light receiving unit 502 (pixel array unit 510) with light, and no reference signal for the determination needs to be acquired in advance, so failures can be detected easily.
An example of a mobile device control system to which the technology proposed in the present disclosure can be applied is described with reference to FIG. 21. FIG. 21 is a block diagram showing an example configuration of a vehicle control system 11, which is an example of a mobile device control system to which the present technology is applied.
Preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is clear that a person with ordinary knowledge in the technical field of the present disclosure can conceive of various changes or modifications within the scope of the technical ideas described in the claims, and it is understood that these also naturally belong to the technical scope of the present disclosure.
(1)
A photodetection device comprising:
a light receiving unit including a pixel array unit composed of a plurality of light receiving elements arranged in a matrix;
a control signal generation unit that generates a control signal;
a control unit electrically connected to the light receiving unit via control lines and controlling the light receiving unit based on the control signal;
a processing unit electrically connected to the light receiving unit via signal lines and processing an output signal from the light receiving unit; and
a failure determination unit that detects failures,
wherein the failure determination unit detects a failure in the control lines from the output signal of the light receiving unit controlled based on a failure detection control signal having a predetermined pattern.
(2)
The photodetection device according to (1) above, wherein the light receiving elements are avalanche photodiodes.
(3)
The photodetection device according to (1) or (2) above, wherein the light receiving unit further includes:
a drive switch, provided for each light receiving element, that drives the light receiving element;
an output switch, provided for each light receiving element, that controls the output of the output signal from the light receiving element; and
a logical sum (OR) circuit that produces an output according to the output signal from each light receiving element and the output signals from the other light receiving elements.
(4)
The photodetection device according to (3) above, wherein the control unit includes:
a horizontal control unit that controls the light receiving elements column by column; and
a vertical control unit that controls the light receiving elements row by row.
(5)
The photodetection device according to (4) above, wherein the control lines include:
a first control line electrically connecting the vertical control unit and the drive switch;
a second control line electrically connecting the vertical control unit and the output switch;
a third control line electrically connecting the horizontal control unit and the drive switch;
a fourth control line electrically connecting the horizontal control unit and the output switch; and
a fifth control line electrically connecting the vertical control unit and the logical sum circuit.
(6)
The photodetection device according to (5) above, wherein the processing unit includes a downsampling circuit.
(7)
The photodetection device according to (6) above, wherein the control signal generation unit outputs the failure detection control signal while a frame synchronization signal is inactive.
(8)
The photodetection device according to (7) above, wherein the failure detection control signal includes a control line failure detection signal for detecting a failure in the control lines, following a processing unit failure detection signal for detecting a failure in the processing unit.
(9)
The photodetection device according to (8) above, wherein the failure detection control signal includes, before or after the control line failure detection signal, a light receiving unit failure detection signal for detecting a failure in the light receiving unit.
(10)
The photodetection device according to (9) above, wherein the control line failure detection signal includes a first signal for detecting a failure in the fifth control line.
(11)
The photodetection device according to (10) above, wherein the control line failure detection signal includes a second signal for detecting a failure in the second and fourth control lines.
(12)
The photodetection device according to (11) above, wherein the control line failure detection signal includes a third signal for detecting a failure in the first and third control lines.
(13)
The photodetection device according to (12) above, wherein the control line failure detection signal includes a fourth signal for detecting a failure in the downsampling circuit.
(14)
The photodetection device according to (13) above, wherein the control line failure detection signal includes the first signal, the second signal, the third signal, and the fourth signal, in that order.
(15)
The photodetection device according to any one of (1) to (14) above, wherein the failure determination unit, upon detecting a failure, outputs a predetermined signal to an output terminal or to a host computer.
(16)
The photodetection device according to any one of (1) to (15) above, formed from:
a first substrate provided with the pixel array unit; and
a second substrate, stacked on the first substrate, provided with the control unit, the processing unit, and the failure determination unit.
(17)
The photodetection device according to (16) above, wherein the control lines are provided on the second substrate.
(18)
A distance measurement system including:
a lighting device that emits irradiation light; and
a photodetection device that receives reflected light produced when the irradiation light is reflected by a subject,
wherein the photodetection device has:
a light receiving unit including a pixel array unit composed of a plurality of light receiving elements arranged in a matrix;
a control signal generation unit that generates a control signal;
a control unit electrically connected to the light receiving unit via control lines and controlling the light receiving unit based on the control signal;
a processing unit electrically connected to the light receiving unit via signal lines and processing an output signal from the light receiving unit; and
a failure determination unit that detects failures, and
the failure determination unit detects a failure in the control lines from the output signal of the light receiving unit controlled based on a failure detection control signal having a predetermined pattern.
11 Vehicle control system
21 Vehicle control ECU (Electronic Control Unit)
22 Communication unit
23 Map information accumulation unit
24 Position information acquisition unit
25 External recognition sensor
26 In-vehicle sensor
27 Vehicle sensor
28 Storage unit
29 Driving assistance / automated driving control unit
30 Driver monitoring system (DMS)
31 Human-machine interface (HMI)
32 Vehicle control unit
41 Communication network
51 Camera
52 Radar
53 LiDAR
54 Ultrasonic sensor
61 Analysis unit
62 Action planning unit
63 Operation control unit
71 Self-position estimation unit
72 Sensor fusion unit
73 Recognition unit
81 Steering control unit
82 Brake control unit
83 Drive control unit
84 Body system control unit
85 Light control unit
86 Horn control unit
90 Distance measurement system
200, 250 Semiconductor substrate
252 Logic array unit
300, 500 Distance measurement device
301, 400 Light source unit
302, 502 Light receiving unit
303 Object under measurement
310 Frequency
311 Range
312 Active light component
510 Pixel array unit
512 SPAD
530 Processing unit
532 Downsampling circuit
534 Failure determination unit
540 Conversion unit
550 Generation unit
560 Signal processing unit
570 Control unit
572 Vertical control unit
574 Horizontal control unit
576 Control signal generation unit
580 Light emission timing control unit
590 Interface
600 Storage device
700 Host
800 Optical system
900 Pixel circuit
902, 904 Transistor
910 Constant current source
920, 940 Switch circuit
930 Inverter circuit
950 OR circuit
R100, R101 Region
Claims (18)
- A photodetection device comprising:
a light receiving unit including a pixel array unit composed of a plurality of light receiving elements arranged in a matrix;
a control signal generation unit that generates a control signal;
a control unit electrically connected to the light receiving unit via control lines and controlling the light receiving unit based on the control signal;
a processing unit electrically connected to the light receiving unit via signal lines and processing an output signal from the light receiving unit; and
a failure determination unit that detects failures,
wherein the failure determination unit detects a failure in the control lines from the output signal of the light receiving unit controlled based on a failure detection control signal having a predetermined pattern.
- The photodetection device according to claim 1, wherein the light receiving elements are avalanche photodiodes.
- The photodetection device according to claim 1, wherein the light receiving unit further includes:
a drive switch, provided for each light receiving element, that drives the light receiving element;
an output switch, provided for each light receiving element, that controls the output of the output signal from the light receiving element; and
a logical sum (OR) circuit that produces an output according to the output signal from each light receiving element and the output signals from the other light receiving elements.
- The photodetection device according to claim 3, wherein the control unit includes:
a horizontal control unit that controls the light receiving elements column by column; and
a vertical control unit that controls the light receiving elements row by row.
- The photodetection device according to claim 4, wherein the control lines include:
a first control line electrically connecting the vertical control unit and the drive switch;
a second control line electrically connecting the vertical control unit and the output switch;
a third control line electrically connecting the horizontal control unit and the drive switch;
a fourth control line electrically connecting the horizontal control unit and the output switch; and
a fifth control line electrically connecting the vertical control unit and the logical sum circuit.
- The photodetection device according to claim 5, wherein the processing unit includes a downsampling circuit.
- The photodetection device according to claim 6, wherein the control signal generation unit outputs the failure detection control signal while a frame synchronization signal is inactive.
- The photodetection device according to claim 7, wherein the failure detection control signal includes a control line failure detection signal for detecting a failure in the control lines, following a processing unit failure detection signal for detecting a failure in the processing unit.
- The photodetection device according to claim 8, wherein the failure detection control signal includes, before or after the control line failure detection signal, a light receiving unit failure detection signal for detecting a failure in the light receiving unit.
- The photodetection device according to claim 9, wherein the control line failure detection signal includes a first signal for detecting a failure in the fifth control line.
- The photodetection device according to claim 10, wherein the control line failure detection signal includes a second signal for detecting a failure in the second and fourth control lines.
- The photodetection device according to claim 11, wherein the control line failure detection signal includes a third signal for detecting a failure in the first and third control lines.
- The photodetection device according to claim 12, wherein the control line failure detection signal includes a fourth signal for detecting a failure in the downsampling circuit.
- The photodetection device according to claim 13, wherein the control line failure detection signal includes the first signal, the second signal, the third signal, and the fourth signal, in that order.
- The photodetection device according to claim 1, wherein the failure determination unit, upon detecting a failure, outputs a predetermined signal to an output terminal or to a host computer.
- The photodetection device according to claim 1, formed from:
a first substrate provided with the pixel array unit; and
a second substrate, stacked on the first substrate, provided with the control unit, the processing unit, and the failure determination unit.
- The photodetection device according to claim 16, wherein the control lines are provided on the second substrate.
- A distance measurement system including:
a lighting device that emits irradiation light; and
a photodetection device that receives reflected light produced when the irradiation light is reflected by a subject,
wherein the photodetection device has:
a light receiving unit including a pixel array unit composed of a plurality of light receiving elements arranged in a matrix;
a control signal generation unit that generates a control signal;
a control unit electrically connected to the light receiving unit via control lines and controlling the light receiving unit based on the control signal;
a processing unit electrically connected to the light receiving unit via signal lines and processing an output signal from the light receiving unit; and
a failure determination unit that detects failures, and
the failure determination unit detects a failure in the control lines from the output signal of the light receiving unit controlled based on a failure detection control signal having a predetermined pattern.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22815597.4A EP4350283A4 (en) | 2021-06-04 | 2022-03-02 | OPTICAL DETECTION DEVICE AND DISTANCE MEASURING SYSTEM |
JP2023525401A JPWO2022254839A1 (ja) | 2021-06-04 | 2022-03-02 | |
CN202280038296.9A CN117396775A (zh) | 2021-06-04 | 2022-03-02 | 光学检测装置和距离测量系统 |
KR1020237039340A KR20240018431A (ko) | 2021-06-04 | 2022-03-02 | 광 검출 장치 및 측거 시스템 |
US18/555,485 US20240201338A1 (en) | 2021-06-04 | 2022-03-02 | Photodetection device and distance measurement system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021094495 | 2021-06-04 | ||
JP2021-094495 | 2021-06-04 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022254839A1 true WO2022254839A1 (ja) | 2022-12-08 |
Family
ID=84324163
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/008765 WO2022254839A1 (ja) | 2021-06-04 | 2022-03-02 | 光検出装置及び測距システム |
Country Status (7)
Country | Link |
---|---|
US (1) | US20240201338A1 (ja) |
EP (1) | EP4350283A4 (ja) |
JP (1) | JPWO2022254839A1 (ja) |
KR (1) | KR20240018431A (ja) |
CN (1) | CN117396775A (ja) |
TW (1) | TW202307461A (ja) |
WO (1) | WO2022254839A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015155854A (ja) * | 2014-02-21 | 2015-08-27 | オムロンオートモーティブエレクトロニクス株式会社 | レーザレーダ装置 |
WO2017209221A1 (ja) * | 2016-05-31 | 2017-12-07 | ソニーセミコンダクタソリューションズ株式会社 | 撮像装置および撮像方法、カメラモジュール、並びに電子機器 |
JP2020010150A (ja) * | 2018-07-06 | 2020-01-16 | ソニーセミコンダクタソリューションズ株式会社 | 撮像装置および撮像システム |
JP2020112528A (ja) | 2019-01-17 | 2020-07-27 | 株式会社デンソー | 光測距装置およびその制御方法 |
JP2020112501A (ja) | 2019-01-16 | 2020-07-27 | 株式会社デンソー | 光学的測距装置および光学的測距装置における異常の発生を検出する方法 |
JP2020143996A (ja) | 2019-03-06 | 2020-09-10 | 株式会社デンソー | 光学的測距装置 |
-
2022
- 2022-03-02 KR KR1020237039340A patent/KR20240018431A/ko unknown
- 2022-03-02 CN CN202280038296.9A patent/CN117396775A/zh active Pending
- 2022-03-02 EP EP22815597.4A patent/EP4350283A4/en active Pending
- 2022-03-02 JP JP2023525401A patent/JPWO2022254839A1/ja active Pending
- 2022-03-02 US US18/555,485 patent/US20240201338A1/en active Pending
- 2022-03-02 WO PCT/JP2022/008765 patent/WO2022254839A1/ja active Application Filing
- 2022-05-09 TW TW111117262A patent/TW202307461A/zh unknown
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015155854A (ja) * | 2014-02-21 | 2015-08-27 | オムロンオートモーティブエレクトロニクス株式会社 | レーザレーダ装置 |
WO2017209221A1 (ja) * | 2016-05-31 | 2017-12-07 | ソニーセミコンダクタソリューションズ株式会社 | 撮像装置および撮像方法、カメラモジュール、並びに電子機器 |
JP2020010150A (ja) * | 2018-07-06 | 2020-01-16 | ソニーセミコンダクタソリューションズ株式会社 | 撮像装置および撮像システム |
JP2020112501A (ja) | 2019-01-16 | 2020-07-27 | 株式会社デンソー | 光学的測距装置および光学的測距装置における異常の発生を検出する方法 |
JP2020112528A (ja) | 2019-01-17 | 2020-07-27 | 株式会社デンソー | 光測距装置およびその制御方法 |
JP2020143996A (ja) | 2019-03-06 | 2020-09-10 | 株式会社デンソー | 光学的測距装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP4350283A4 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022254839A1 (ja) | 2022-12-08 |
KR20240018431A (ko) | 2024-02-13 |
EP4350283A4 (en) | 2024-09-04 |
US20240201338A1 (en) | 2024-06-20 |
CN117396775A (zh) | 2024-01-12 |
EP4350283A1 (en) | 2024-04-10 |
TW202307461A (zh) | 2023-02-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11940536B2 (en) | Light receiving element and ranging system | |
US20230230368A1 (en) | Information processing apparatus, information processing method, and program | |
US20210224617A1 (en) | Information processing device, information processing method, computer program, and mobile device | |
CN114424265A (zh) | 信号处理设备、信号处理方法、程序和移动设备 | |
WO2022254839A1 (ja) | 光検出装置及び測距システム | |
US12117313B2 (en) | Photodetection device and photodetection system | |
WO2024181041A1 (ja) | 測距装置及び測距方法 | |
WO2023276223A1 (ja) | 測距装置、測距方法及び制御装置 | |
US20240241227A1 (en) | Distance measuring device and distance measuring method | |
US20240272285A1 (en) | Light source control device, light source control method, and distance measuring device | |
WO2023145529A1 (ja) | 情報処理装置、情報処理方法および情報処理プログラム | |
US20240179429A1 (en) | Solid-state imaging device, imaging device, processing method in solid-state imaging device, processing program in solid-state imaging device, processing method in imaging device, and processing program in imaging device | |
WO2024024471A1 (ja) | 情報処理装置、情報処理方法、及び、情報処理システム | |
WO2023223928A1 (ja) | 測距装置及び測距システム | |
WO2024185361A1 (ja) | 固体撮像装置 | |
WO2023149089A1 (ja) | 学習装置、学習方法及び学習プログラム | |
WO2024009739A1 (ja) | 光学式測距センサ、及び光学式測距システム | |
WO2024106196A1 (ja) | 固体撮像装置および電子機器 | |
US20240241235A1 (en) | Light detecting device and distance measuring system | |
WO2024062842A1 (ja) | 固体撮像装置 | |
WO2023063145A1 (ja) | 情報処理装置、情報処理方法および情報処理プログラム | |
WO2023074419A1 (ja) | 情報処理装置、情報処理方法、及び、情報処理システム | |
WO2023162497A1 (ja) | 画像処理装置、画像処理方法及び画像処理プログラム | |
WO2023281824A1 (ja) | 受光装置、測距装置及び受光装置の制御方法 | |
WO2023047666A1 (ja) | 情報処理装置、および情報処理方法、並びにプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22815597 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023525401 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18555485 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280038296.9 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022815597 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022815597 Country of ref document: EP Effective date: 20240104 |