CN112068149B - Sensing device and method for sensing - Google Patents
- Publication number
- Publication number: CN112068149B (application CN202010521767.4A)
- Authority
- CN
- China
- Prior art keywords
- pri
- pulses
- response
- processing
- sequence
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G01S17/48—Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
- G01S17/26—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein the transmitted pulses use a frequency-modulated or phase-modulated carrier wave, e.g. for pulse compression of received signals
- G01S7/484—Transmitters (details of pulse systems)
- G01S17/894—3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
- G01S17/10—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
- G01S7/4815—Constructional features, e.g. arrangements of optical elements, of transmitters alone using multiple transmitters
- G01S7/486—Receivers (details of pulse systems)
- G01S7/4861—Circuits for detection, sampling, integration or read-out
- G01S7/4865—Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
- G01S7/4876—Extracting wanted echo signals, e.g. pulse detection, by removing unwanted signals
- G01S7/524—Transmitters (details of sonar pulse systems)
- G01S7/527—Extracting wanted echo signals (sonar)
- G01S7/53—Means for transforming coordinates or for evaluating data, e.g. using computers
- H01S5/183—Surface-emitting [SE] lasers having only vertical cavities, e.g. vertical cavity surface-emitting lasers [VCSEL]
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- G01J2001/446—Photodiode (type of detector)
- G01S13/865—Combination of radar systems with lidar systems
- H04M2250/10—Details of telephonic subscriber devices including a GPS signal receiver
- H04M2250/52—Details of telephonic subscriber devices including functional features of a camera
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Condensed Matter Physics & Semiconductors (AREA)
- Optics & Photonics (AREA)
- Human Computer Interaction (AREA)
- Signal Processing (AREA)
- Optical Radar Systems And Details Thereof (AREA)
- Aviation & Aerospace Engineering (AREA)
Abstract
The present disclosure relates to selection of pulse repetition intervals for time-of-flight sensing. The sensing device comprises a radiation source that emits pulses of optical radiation toward a plurality of points in a target scene. A receiver receives the optical radiation reflected from the target scene and outputs signals indicative of the respective times of flight of the pulses to and from the points in the target scene. Processing and control circuitry selects a first Pulse Repetition Interval (PRI) and a second PRI greater than the first PRI from an allowed range of PRIs, drives the radiation source to emit a first sequence of the pulses at the first PRI and a second sequence of the pulses at the second PRI, and processes the signals output in response to both the first and second sequences of pulses to calculate respective depth coordinates of the points in the target scene.
Description
Technical Field
The present invention relates generally to systems and methods for depth mapping, and in particular to beam sources used in time of flight (ToF) sensing.
Background
Existing and emerging consumer applications have created an increasing demand for real-time three-dimensional (3D) imagers. These imaging devices, also known as depth sensors or depth mappers, enable the remote measurement of distance (and often intensity as well) to each point in a target scene, referred to as the target scene depth, by illuminating the target scene with an optical beam and analyzing the reflected optical signal. A common technique for determining the distance to each point in the target scene involves transmitting one or more pulsed optical beams toward the scene and then measuring the round-trip time, i.e., the time of flight (ToF), taken by the beams in traveling from the source to the target scene and back to a detector array adjacent to the source.
Some ToF systems use Single Photon Avalanche Diodes (SPADs), also known as geiger-mode avalanche photodiodes (GAPDs), in measuring photon arrival times.
Disclosure of Invention
Embodiments of the invention described below provide improved depth mapping systems and methods of operation of such systems.
Thus, according to one embodiment of the invention there is provided a sensing device comprising a radiation source configured to emit pulses of optical radiation towards a plurality of points in a target scene. The receiver is configured to receive optical radiation reflected from the target scene and to output a signal indicative of a respective time of flight of the pulses to and from the point in the target scene in response to the received optical radiation. The processing and control circuitry is configured to select a first Pulse Repetition Interval (PRI) and a second PRI greater than the first PRI from a range of allowed PRIs, and to drive the radiation source to transmit a first sequence of the pulses at the first PRI and a second sequence of the pulses at the second PRI, and to process signals output by the receiver in response to both the first sequence and the second sequence of the pulses so as to calculate respective depth coordinates of the point in the target scene.
In the disclosed embodiments, the radiation source comprises an array of Vertical Cavity Surface Emitting Lasers (VCSELs). Additionally or alternatively, the radiation source comprises an array of emitters arranged in a plurality of groups, and the processing and control circuitry is configured to drive the plurality of groups sequentially, such that each group emits respective first and second sequences of the pulses at the first and second PRIs. Additionally or alternatively, the sensing elements of the receiver include Single Photon Avalanche Diodes (SPADs).
In some embodiments, the first PRI defines a range limit at which the time of flight of the pulses is equal to the first PRI, and the processing and control circuitry is configured to compare signals output by the receiver in response to the first and second sequences of pulses so as to distinguish points in the scene having respective depth coordinates less than the range limit from points in the scene having respective depth coordinates greater than the range limit, thereby resolving distance folding of the depth coordinates. In one such embodiment, the processing and control circuit is configured to calculate, for each point in the scene, respective first and second histograms of time of flight of pulses in the first and second sequences, and detect that a distance fold has occurred at a given point in response to a difference between the first and second histograms.
In some embodiments, the apparatus includes one or more radio transceivers that communicate over the air by receiving signals in at least one allocated frequency band, wherein the processing and control circuitry is configured to identify the allowed PRI range responsive to the allocated frequency band. In general, the allowed range is defined such that PRIs in the allowed range have no harmonics within the allocated frequency band. Additionally or alternatively, the processing and control circuitry is configured to modify the allowed range in response to a change in the allocated frequency band of the radio transceiver, and to select a new value for one or both of the first PRI and the second PRI such that the new value falls within the modified range.
In the disclosed embodiments, the processing and control circuitry is configured to store records of multiple sets of PRIs, identify an operating environment of the device, and select one of the sets to apply to driving the radiation source in response to the identified operating environment. The processing and control circuitry may be configured to select the one of the groups in response to a geographical area in which the device is operating. Additionally or alternatively, the groups of PRIs have respective priorities assigned in response to a likelihood of interfering with frequencies used by the radio transceiver, and the processing and control circuitry is configured to select the one of the groups in response to the respective priorities. In one embodiment, PRIs in each group are mutually exclusive relative to other PRIs in the group.
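The stored-group lookup described above can be sketched as follows. This is an illustrative assumption of how such records might be organized; the regions, priorities, and PRI values are hypothetical, not taken from the patent:

```python
# Hypothetical record of PRI groups, keyed by operating region.
# Each entry is (priority, PRIs in ns); lower number = higher priority,
# with priorities assigned by likelihood of radio interference.
PRI_GROUPS = {
    "EU": [(1, (41.0, 47.0)), (2, (43.0, 53.0))],
    "US": [(1, (40.0, 46.0)), (2, (44.0, 52.0))],
}

def select_pri_group(region, groups=PRI_GROUPS):
    """Pick the highest-priority stored PRI group for the given region."""
    candidates = groups.get(region)
    if not candidates:
        raise KeyError(f"no PRI groups stored for region {region!r}")
    _, pris = min(candidates, key=lambda entry: entry[0])
    return pris

print(select_pri_group("EU"))  # -> (41.0, 47.0)
```

In practice the record could be indexed by any identifiable feature of the operating environment (geographic area, active radio bands); the region key is used here only to keep the sketch concrete.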
In another embodiment, the processing and control circuit is configured to select a third PRI greater than the second PRI from the allowed PRI ranges, and to drive the radiation source to emit a third sequence of the pulses at the third PRI, and to process signals output by the receiver in response to the first, second and third sequences of pulses, so as to calculate the respective depth coordinates of the point in the target scene.
Additionally or alternatively, the processing and control circuitry is configured to select the first PRI and the second PRI so as to maximize the range of the depth coordinates while maintaining the resolution of the depth coordinates within a predefined resolution limit.
There is also provided, in accordance with an embodiment of the present invention, a sensing method including selecting a first Pulse Repetition Interval (PRI) and a second PRI greater than the first PRI from a range of allowed PRIs. The radiation source is driven to emit a first sequence of pulses of optical radiation at a first PRI and a second sequence of pulses of said optical radiation at a second PRI towards each of a plurality of points in the target scene. Optical radiation reflected from the target scene is received, and a signal indicative of a respective time of flight of the pulses to and from the point in the target scene is output in response to the received optical radiation. Signals are processed in response to both the first and second sequences of pulses in order to calculate respective depth coordinates of the point in the target scene.
The invention will be more fully understood from the following detailed description of embodiments of the invention, taken together with the accompanying drawings, in which:
Drawings
FIG. 1 is a block diagram schematically illustrating a mobile communication device with a depth map camera according to an embodiment of the present invention;
FIG. 2 is a schematic side view of a depth map camera according to an embodiment of the invention;
FIG. 3 is a schematic front view of an emitter array that may be used in a depth map camera according to an embodiment of the present invention;
FIGS. 4A, 4B, 4C, and 4D are graphs schematically illustrating timing signals in a depth map camera according to an embodiment of the present invention;
FIG. 5 is a block diagram schematically illustrating the operation of a depth map camera using multiple groups of transmitters, according to an embodiment of the present invention;
FIG. 6 is a graph schematically illustrating harmonic frequencies of a pair of pulse repetition intervals selected with reference to a cellular communication band in accordance with an embodiment of the present invention;
FIG. 7 is a flow chart schematically illustrating a method for selecting a pulse repetition interval for use in depth mapping, in accordance with an embodiment of the present invention; and
FIG. 8 is a flow chart schematically illustrating a method for selecting a pulse repetition interval for use in depth mapping according to another embodiment of the present invention.
Detailed Description
Overview
Embodiments of the present invention provide a ToF-based depth sensing device wherein a radiation source emits pulses of optical radiation towards a plurality of points in a target scene. (In the context of the present description and claims, the term "optical radiation" is used interchangeably with the term "light," meaning electromagnetic radiation in any of the visible, infrared, and ultraviolet spectral ranges.) A receiver receives optical radiation reflected from the target scene and outputs signals indicative of the respective times of flight of the pulses to and from the points in the target scene. The processing and control circuitry drives the radiation source and processes the signals output by the receiver to calculate the corresponding depth coordinates of the points in the target scene.
Such devices often suffer from a low signal-to-noise ratio (SNR). To improve the SNR, the processing and control circuitry collects and analyzes signals from the receiver over a sequence of pulses emitted by the radiation source. In some cases, the processing and control circuitry calculates a histogram of the times of flight of the pulses reflected from each point in the target scene and uses an analysis of the histogram (e.g., the mode of the histogram at each point) as an indication of the corresponding depth coordinate. Because the signals are collected over sequences of many pulses, generating and outputting depth coordinates at a reasonable frame rate (e.g., 30 frames/second) requires the radiation source to transmit the pulse sequences at a high Pulse Repetition Frequency (PRF), or equivalently with a short Pulse Repetition Interval (PRI). For example, the radiation source may output pulses of about 1 ns duration at a PRI of 40-50 ns.
However, using a short PRI creates a distance-folding problem: since optical radiation propagates at a speed of about 30 cm/ns, when pulses emitted in a sequence with a PRI of 40 ns are reflected from an object more than about 6 m from the device, the reflected radiation reaches the receiver only after the radiation source has emitted the next pulse. The processing and control circuitry is then unable to determine whether the received radiation originated from the most recently transmitted pulse, reflected from a nearby object, or from a pulse transmitted earlier in the sequence, reflected from a distant object. The PRI thus effectively defines a range limit, proportional to the PRI, that sets an upper bound on the distance of objects that the device can sense unambiguously.
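The proportionality between the PRI and the range limit follows directly from the speed of light (about 0.3 m/ns) and the factor of two for the round trip; a minimal sketch:

```python
C_M_PER_NS = 0.299792458  # speed of light in m/ns

def range_limit_m(pri_ns):
    """Maximum unambiguous one-way distance for a given PRI:
    a reflection must arrive before the next pulse is emitted,
    so the round-trip ToF may not exceed the PRI."""
    return C_M_PER_NS * pri_ns / 2.0

print(round(range_limit_m(40.0), 2))  # prints 6.0 (about 6 m, matching the example above)
```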
Embodiments of the present invention address this problem by using two or more different PRIs in series. The processing and control circuitry selects the first PRI and the second PRI from (at least) the allowed PRI range. The radiation source is then driven to emit a first sequence of said pulses at a first PRI and a second sequence of said pulses at a second PRI, and signals output by the receiver in response to both the first and second sequences of pulses are processed to calculate depth coordinates of said points in the target scene. The sequences may be transmitted one after the other, or they may be interleaved, with pulses transmitted alternately at the first PRI and the second PRI. Although the embodiments described below primarily refer to only a first PRI and a second PRI for simplicity, the principles of these embodiments can be readily extended to three or more different PRIs.
More specifically, to disambiguate possible distance folds, the processing and control circuitry compares the signals output by the receiver in response to the first sequence of pulses with the signals output in response to the second sequence, in order to distinguish points in the scene whose respective depth coordinates are less than the range limit defined by the PRI from points whose respective depth coordinates are greater than the range limit. For example, the processing and control circuitry may calculate respective first and second histograms of the times of flight of the pulses in the first and second sequences for each point in the scene. For objects closer than the range limit, the two histograms will be approximately the same; objects beyond the range limit, however, will produce different histograms in response to the different PRIs of the first and second pulse sequences. The processing and control circuitry is thus able to detect, based on the similarity or difference between the first and second histograms at each point, whether distance folding has occurred at that point in the target scene.
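The histogram-comparison test can be illustrated with a small simulation. The bin width, timing jitter, and PRI values below are illustrative assumptions, and `tof_histogram` and `folding_detected` are hypothetical helper names, not functions from the patent:

```python
import numpy as np

def tof_histogram(true_tof_ns, pri_ns, bin_ns=1.0, n_pulses=1000, jitter_ns=0.3, rng=None):
    """Accumulate arrival times, wrapped modulo the PRI, into a ToF histogram."""
    if rng is None:
        rng = np.random.default_rng(0)
    arrivals = (true_tof_ns + rng.normal(0.0, jitter_ns, n_pulses)) % pri_ns
    bins = np.arange(0.0, pri_ns + bin_ns, bin_ns)
    hist, _ = np.histogram(arrivals, bins=bins)
    return hist

def folding_detected(hist1, hist2, tol_bins=2):
    """Peaks that disagree between the two PRI sequences indicate that the
    true ToF exceeds at least one PRI, i.e., distance folding has occurred."""
    return abs(int(np.argmax(hist1)) - int(np.argmax(hist2))) > tol_bins

# Near target (ToF 20 ns): the peak lands in the same bin under both PRIs.
near1 = tof_histogram(20.0, 40.0)
near2 = tof_histogram(20.0, 50.0)
# Far target (ToF 55 ns): wraps to ~15 ns under PRI 40 but ~5 ns under PRI 50.
far1 = tof_histogram(55.0, 40.0)
far2 = tof_histogram(55.0, 50.0)
print(folding_detected(near1, near2), folding_detected(far1, far2))  # -> False True
```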
Another problem arises when the depth sensing device is incorporated into a mobile communication device such as a smart phone: a mobile communication device includes at least one radio transceiver (and often a plurality of radio transceivers) that communicate over the air by receiving signals in an allocated frequency band, such as one of the frequency bands defined by the ubiquitous LTE standard for cellular communications. Furthermore, the allocated frequency bands will often change as the device roams from one cell to another. At the same time, the sequence of short, high current pulses used to drive the radiation source at a high PRF produces harmonics, some of which may fall within the allocated frequency band of the transceiver. Noise due to these harmonics can severely degrade the SNR of the radio transceiver.
To overcome this problem, in some embodiments of the invention, the processing and control circuitry identifies an allowed PRI range in a manner that avoids interfering with the allocated frequency band of the radio transceiver, and selects a first PRI value and a second PRI value to be within the allowed range. In other words, the allowed range is preferably defined such that PRI in the allowed range has no harmonics within the allocated frequency band. The allowed range may be defined, for example, as a list of allowed PRI values, or a set of intervals within which PRI values may be selected. Additionally or alternatively, groups of two or more PRIs may be predefined and stored in a record maintained by the device. The appropriate group may then be identified and used according to the radio operating environment of the device.
When the allocated frequency band of the radio transceiver changes, the processing and control circuitry modifies the allowed range or set of PRI values accordingly. If necessary, the processing and control circuitry selects new values for one or more of the PRIs so that the new values fall within the modified range. Within these range constraints, the PRIs may be selected by applying predefined optimization criteria, for example, to maximize the range of the depth coordinates while maintaining the resolution of the depth coordinates at a value no greater than a predefined resolution limit.
As previously mentioned, although for simplicity some embodiments described herein relate to a scenario using two PRI values, the principles of the present invention may be similarly applied to selecting and using three or more PRI values. In terms of enhancing the range and resolution of the depth map, it may be useful to use a greater number of PRI values within different portions of the allowed range.
System description
Fig. 1 is a block diagram schematically illustrating a mobile communication device 10 including a ToF-based depth camera 20 in accordance with an embodiment of the present invention. Device 10 is illustrated in fig. 1 as a smart phone, but the principles of the present invention are similarly applicable in any kind of device that performs both radio communication (typically, but not exclusively, through a cellular network) and high pulse repetition rate optical sensing.
The device 10 comprises a plurality of radio transceivers 12, which transmit radio signals to and/or receive radio signals from respective networks. For an LTE cellular network, for example, the radio signals may be in any of a number of different frequency bands, typically in the range between 800 MHz and 3000 MHz, depending on the region and the type of service. As the device 10 roams, the frequency bands in which it transmits and receives signals will typically change. The frequency controller 14 in the device 10 selects the frequencies that the transceivers 12 use in radio communications at any given time.
The camera 20 senses depth by outputting an optical pulse train towards a target scene and measuring the time of flight of pulses reflected back from the scene to the camera. Details of the structure and operation of the camera 20 are described with reference to the following figures.
The generation of the optical pulses emitted from the camera 20 creates significant electrical noise within the device 10, both at the pulse repetition frequency (PRF) of the camera 20, which is the inverse of the pulse repetition interval (PRI), and at harmonics of the PRF. To avoid interfering with the operation of the transceivers 12, the frequency controller 14 provides the camera 20 with a current range of allowed PRIs whose harmonics fall well outside the frequency bands over which the transceivers 12 are currently transmitting and receiving. (Alternatively, the frequency controller may inform the camera of the frequency bands on which the transceivers are currently transmitting and receiving, and the camera may itself derive the current range of allowed PRIs on this basis.) This range may have the form, for example, of a list of allowed PRIs (or equivalently, PRFs), or of a set of intervals within which the PRIs (or PRFs) may be selected. The camera 20 selects a pair of PRIs, or possibly three or more PRIs, from the allowed range that will give the best depth mapping performance while avoiding interference with the communications of the device 10. Details of the criteria and procedures for this selection are described below.
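The harmonic check that the camera (or the frequency controller) performs can be sketched as follows. The function names, the 10 MHz guard margin, and the 6 GHz harmonic cutoff are illustrative assumptions, not values taken from the patent.

```python
def pri_is_allowed(pri_ns, bands_mhz, guard_mhz=10.0, f_max_mhz=6000.0):
    """Return True if no harmonic of the PRF (the inverse of the PRI)
    falls inside any protected radio band, widened by a guard margin,
    up to a maximum frequency of interest."""
    prf_mhz = 1e3 / pri_ns           # PRI in ns -> fundamental PRF in MHz
    n = 1
    while n * prf_mhz <= f_max_mhz:
        for lo, hi in bands_mhz:
            if lo - guard_mhz <= n * prf_mhz <= hi + guard_mhz:
                return False         # harmonic n lands in a protected band
        n += 1
    return True

def allowed_pris(candidate_pris_ns, bands_mhz):
    """Filter a candidate list down to the allowed PRI range."""
    return [p for p in candidate_pris_ns if pri_is_allowed(p, bands_mhz)]
```

For instance, a 40 ns PRI (PRF = 25 MHz) would be excluded while a 1920-1980 MHz LTE uplink band is allocated, since its 77th-79th harmonics fall inside that band.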
Fig. 2 is a schematic side view of a depth camera 20 according to an embodiment of the invention. The camera 20 includes a radiation source 21, which emits M individual pulsed beams (e.g., M may be about 500). The radiation source comprises emitters arranged in a two-dimensional array 22, which may be grouped into groups (as shown in detail in fig. 3), together with beam optics 37. The emitters typically comprise solid-state devices, such as vertical-cavity surface-emitting lasers (VCSELs) or other types of lasers or light-emitting diodes (LEDs). The emitters are driven by the controller 26 to emit optical pulses at two different PRIs, as described further below.
Beam optics 37 typically include a collimating lens and may include a diffractive optical element (DOE) that replicates the actual beams emitted by the array 22 to produce the M beams projected onto the scene 32. (For example, an array of four groups of 16 VCSELs, with a 4 x 4 arrangement in each group, may be used to create 8 x 8 beams, and the DOE may divide each beam into 3 x 3 copies to give a total of 24 x 24 beams.) For simplicity, these internal elements of the beam optics 37 are not shown.
The receiver 23 in the camera 20 comprises a two-dimensional detector array, such as a SPAD array 24, together with J processing units 28 and select lines 31 for coupling the processing units to the SPADs. A combining unit 35 passes the digital outputs of the processing units 28 to the controller 26. SPAD array 24 includes a number N of detector elements, which may be equal to M or may be much larger than M, such as 100 x 100 pixels or 200 x 200 pixels. The number J of processing units 28 depends on the number of pixels of SPAD array 24 to which each processing unit is coupled.
The array 22 emits M pulsed beams 30 of light, which are directed toward a target scene 32 by the beam optics 37. Although the beams 30 are depicted in fig. 2 as parallel beams of constant width, each beam diverges as dictated by diffraction. In addition, the beams 30 diverge from one another to cover a desired area of the scene 32. The scene 32 reflects or otherwise scatters those beams 30 that are incident on it. The reflected and scattered beams are collected by objective optics 34, represented by a lens in fig. 2, which form an image of the scene 32 on the array 24. Thus, for example, a small region 36 of the scene 32 on which the beam 30a has been incident is imaged onto a small region 38 of the SPAD array 24.
A Cartesian coordinate system 33 defines the orientation of the depth camera 20 and the scene 32. The x-axis and y-axis are oriented in the plane of SPAD array 24. The z-axis is perpendicular to the array and points toward the scene 32 imaged onto SPAD array 24.
For clarity, the processing units 28 are illustrated as if separate from the SPAD array 24, but they are typically integrated with the SPAD array. Similarly, the combining unit 35 is typically integrated with the SPAD array 24. The processing unit 28, together with the combining unit 35, comprises hardware amplification and logic circuitry that senses and records pulses output by SPADs in corresponding pixels or groups of pixels (referred to as "super-pixels"). Thus, these circuits measure the arrival time of the photons producing the pulses, as well as the intensity of the optical pulses incident on SPAD array 24.
Processing unit 28, together with combining unit 35, may assemble one or more histograms of the arrival times of the plurality of pulses transmitted by array 22, and thus output signals indicative of distance from corresponding points in scene 32, as well as signal strength. Circuits useful for this purpose are described, for example, in U.S. patent application publication 2017/0176579, the disclosure of which is incorporated herein by reference. Alternatively or in addition, some or all of the components of the processing unit 28 and the combining unit 35 may be separate from the SPAD array 24 and may be integrated with the controller 26, for example. For the sake of generality, the controller 26, the processing unit 28 and the combining unit 35 are collectively referred to herein as "processing and control circuitry".
The controller 26 is coupled to both the radiation source 21 and the receiver 23. The controller 26 alternately drives groups of transmitters in the array 22 at the appropriate PRI to transmit pulsed beams. The controller also provides external control signals 29 to the processing and combining units in the receiver 23 and receives output signals from the processing and combining units. The output signal may include histogram data and may be used by the controller 26 to derive both the time of incidence and the signal strength. The controller 26 calculates the time of flight of the M beams from the timing of the emission of the beams 30 by the VCSEL array 22 and the times of arrival measured by the M processing units 28, mapping the distances from the corresponding M points in the scene 32.
In some embodiments, to best utilize the available sensing and processing resources, the controller 26 identifies the respective areas of the SPAD array 24 to which pulses of optical radiation reflected from the respective areas of the target scene 32 are imaged by the lens 34, and selects super pixels to correspond to those areas. The signals output by sensing elements outside of these regions are not used and therefore these sensing elements can be deactivated, for example by reducing or turning off the bias voltages of these sensing elements.
The dimensions of the emitter array 22 and SPAD array 24 are exaggerated in fig. 2 relative to the scene 32 for clarity. The lateral separation (referred to as "baseline") of the emitter array 22 and SPAD array 24 is actually much smaller than the distance from the emitter array 22 to the scene 32. Thus, the chief ray 40 (the ray passing through the center of the objective optic 34) from the scene 32 to the SPAD array 24 is nearly parallel to the ray 30, resulting in only a small amount of parallax.
The controller 26 typically includes a programmable processor that is programmed in software and/or firmware to perform the functions described herein. Alternatively or in addition, the controller 26 includes hardwired and/or programmable hardware logic circuitry that performs at least some of the functions of the controller. Although for simplicity the controller 26 is illustrated in the figures as a single monolithic functional block, in practice the controller may comprise a single chip or a set of two or more chips with suitable interfaces for receiving and outputting signals as shown in the figures and described herein.
One of the functional units of the controller 26 is a depth processing unit (DPU) 27, which receives and processes the signals from the processing units 28. DPU 27 calculates the times of flight of the photons in each of the beams 30 and thus maps the distances to the corresponding points in the target scene 32. This mapping is based on the timing of the emission of the beams 30 by the emitter array 22 and on the times of arrival (i.e., the times of incidence of the reflected photons) measured by the processing units 28. DPU 27 uses the histograms accumulated at the two different PRIs of the emitter array 22 to disambiguate any "range folding" that may occur, as explained below with reference to fig. 4. The controller 26 typically stores the depth coordinates in a memory and may output a corresponding depth map for display and/or further processing.
Fig. 3 is a schematic front view of the emitter array 22 in the radiation source 21, according to an embodiment of the present invention. Four groups 62a, 62b, 62c, and 62d of vertical emitters 54 are interleaved as alternating vertical stripes on a substrate 64, such as a semiconductor chip. Each group includes a plurality of stripes, which alternate with the stripes of the other groups on the substrate. Alternatively, other interleaving schemes may be used. The emitters 54 emit respective beams 30 toward the optics 37, which collimate the beams and project them toward the target scene. In a typical implementation, the emitters 54 comprise VCSELs driven by electrical pulses about 1 ns wide, with sharp rising and falling edges, a peak pulse current exceeding 1 A, and a PRI of about 40 ns. Alternatively, other timing and current parameters may be used, depending on application requirements.
To enable selection of and switching between the different groups, the array 22 may be mounted on a driver chip (not shown), for example a silicon chip with CMOS circuitry for selecting and driving the individual VCSELs or groups of VCSELs. In this case, the VCSEL groups may be physically separated for ease of fabrication and control, or they may be interleaved on the VCSEL chip, with appropriate connections to the driver chip to enable alternating actuation of the groups. Thus, the beams 30 illuminate the target scene in a time-division multiplexed mode, wherein different sets of the beams are incident on respective areas of the scene at different times.
As a further alternative to the illustrated embodiment, the array 22 may include a greater or lesser number of groups and emitters. Typically, to adequately cover the target scene with a static (non-scanning) beam, the array 22 includes at least four groups 62, each having at least four emitters 54 therein, and possibly a DOE for dividing the radiation emitted by each emitter. For denser coverage, the array 22 includes at least eight groups, with twenty or more transmitters in each group. These options enhance the flexibility of the camera 20 in terms of optical and electrical power budgets and time division multiplexing of processing resources.
PRI selection and control
Fig. 4A is a graph that schematically illustrates timing signals in the depth camera 20, according to an embodiment of the present invention. In this example, the controller 26 has selected two different PRIs: PRI1 = 40 ns and PRI2 = 44 ns. The controller drives the emitters 54 to emit pulse trains at PRI1 alternately with pulse trains at PRI2. PRI1 gives rise to a range limit 75 of about 6 m from the camera 20. In the embodiment shown in this figure, the times of flight of the pulses at PRI1 and PRI2 are measured by a time-to-digital converter (TDC) having the same slope for both PRIs, meaning that the time resolution is the same at both PRIs.
In the illustrated scenario, radiation source 22 transmits pulses 70. An object at a close distance (e.g., 2.4 m from the camera 20) returns a reflected pulse 72, which reaches the receiver 23 after a ToF of 16 ns. To measure the ToF, receiver 23 counts the time that elapses between the transmitted pulse 70 and the receipt of the reflected pulse 72. The count values for the pulse trains at PRI1 are represented in fig. 4A by a first sawtooth 76, while the values for the pulse trains at PRI2 are represented by a second sawtooth 78. The two sawtooth waveforms are identical in shape, but there is an offset in the spacing between successive waveforms due to the difference in PRI. Thus, for the reflected pulse 72, the same ToF of 16 ns will be measured in both of the pulse sequences, at PRI1 and at PRI2. For purposes of TDC resynchronization, each sawtooth in this embodiment is followed by a reset period.
On the other hand, an object at a distance beyond the range limit (e.g., 8.4 m from the camera 20) will return a reflected pulse 74, which reaches the receiver 23 after a ToF of 56 ns. This pulse 74 thus reaches the receiver after the radiation source 22 has transmitted the next pulse in the sequence, and after the counters represented by sawtooth waveforms 76 and 78 have been zeroed. Consequently, receiver 23 will record a ToF of 16 ns for the pulse 74 during the sequence at PRI1. Since the PRI is longer during the sequence at PRI2, however, the receiver will record a ToF of 12 ns during this sequence.
Figs. 4B and 4C are graphs that schematically illustrate timing signals in the depth camera 20, according to other embodiments of the present invention. These timing schemes operate similarly to the timing scheme described above with reference to fig. 4A, but without the use of a TDC reset period. In the embodiment of fig. 4B, the TDC uses the same count resolution (the same maximum count number) both for the sawtooth 76 of PRI1 and for the longer sawtooth 78 of PRI2. This approach is advantageous in reducing the number of memory bins used in accumulating the histograms. In the embodiment of fig. 4C, the TDC detects the photon arrival times at both PRI1 and PRI2 with the same absolute time resolution, meaning that the depth resolution can be enhanced, but at the cost of using a larger number of histogram bins.
When processing the ToF results in any of the above schemes, the controller 26 will detect that a certain point in the scene 32 has two different ToF values during the pulse sequences at the two different PRI values, separated by the difference (4 ns) between the PRI values. In this manner, the controller 26 is able to detect that a range fold has occurred at this point. The controller thus distinguishes points in the scene where the corresponding depth coordinate is less than the range limit 75 from points in the scene where the corresponding depth coordinate is greater than the range limit, thereby resolving the range folding of the depth coordinates. Based on the difference between the PRI values (or equivalently, the beat frequency between the PRF values), the controller 26 may also be able to distinguish between different multiples of the range limit 75, and thus extend the detection range even further.
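The arithmetic of this disambiguation can be illustrated with a short sketch. The function and constant names are hypothetical, and only a single fold of the first PRI is handled here.

```python
SPEED_OF_LIGHT_MM_PER_NS = 299.792458

def unfold_depth_mm(tof1_ns, tof2_ns, pri1_ns, pri2_ns, tol_ns=0.5):
    """Resolve one range fold from ToF values measured at two PRIs.
    Equal ToFs -> the target is inside the range limit; ToFs differing
    by (PRI2 - PRI1) -> one fold, so the true ToF is tof1 + PRI1."""
    if abs(tof1_ns - tof2_ns) <= tol_ns:
        true_tof = tof1_ns                          # no fold
    elif abs((tof1_ns - tof2_ns) - (pri2_ns - pri1_ns)) <= tol_ns:
        true_tof = tof1_ns + pri1_ns                # one fold
    else:
        raise ValueError("inconsistent ToF pair")
    return true_tof * SPEED_OF_LIGHT_MM_PER_NS / 2  # round trip -> depth

# The scenario of Figs. 4A-4C: 16 ns at PRI1 = 40 ns together with
# 12 ns at PRI2 = 44 ns implies a true ToF of 56 ns (about 8.4 m).
```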
Fig. 4D is a graph that schematically illustrates timing signals in the depth camera 20, according to another embodiment of the present invention. In this example, the controller 26 selects three or more different PRIs: PRI1, PRI2, PRI3, ..., PRIk. The controller drives the radiation source 22 to emit the pulses 70 alternately at the different PRIs in sequence. In this case, it is assumed that the object is at a distance D (corresponding to a time of flight T) beyond the folding limit, such that each reflected pulse 72 reaches the receiver 23 after the next transmitted pulse 70 has been transmitted. In other words, each reflected pulse j arrives at the receiver at a time Tj after pulse j+1 has been transmitted. The actual distance to the object (in terms of time of flight) is given by: T = T1 + PRI1 = T2 + PRI2 = T3 + PRI3 = ...
The receiver 23 may experience a "dead zone" immediately after each transmitted pulse 70, where the receiver is unable to detect pulses reflected from the target object due to scattered reflections within the camera 20. This dead zone is exemplified by the last reflected pulse k in fig. 4D, which reaches the receiver 23 a small time Tk after the next transmitted pulse. Therefore, the controller 26 may ignore signals output by the receiver in the dead zone. As in the present example, the use of three or more different PRIs is advantageous in avoiding ambiguity due to such dead zones and improving the accuracy of depth measurements. When three or more different PRIs are used, it is also possible to select a relatively short PRI value without risk of ambiguity due to distance folding and dead zones, thereby increasing the frequency of harmonics and reducing the probability of interfering with the radio transceiver 12 (fig. 1).
Advantageously, the PRI values used together in the category of schemes shown in fig. 4D are selected such that each PRI (PRI1, PRI2, PRI3, ...) is coprime with the other PRIs (meaning that the greatest common divisor of any pair of the PRIs, expressed as integers, is 1). In this case, according to the Chinese remainder theorem, a given set of measurements T1, T2, T3, ... will have exactly one solution T, as in the formula given above, regardless of the distance folding.
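This application of the Chinese remainder theorem can be sketched as follows, under the simplifying assumptions that the PRIs are pairwise-coprime integers in nanoseconds and that each measurement Tj is the true ToF reduced modulo PRIj; the function names and the example PRI values are illustrative, not taken from the patent.

```python
from math import prod

def crt_unfold(residues_ns, pris_ns):
    """Recover the unique true ToF T (modulo the product of the PRIs)
    from folded measurements Tj = T mod PRIj, for pairwise-coprime
    integer PRIs, via the standard CRT construction."""
    M = prod(pris_ns)
    t = 0
    for tj, pri in zip(residues_ns, pris_ns):
        m = M // pri
        t += tj * m * pow(m, -1, pri)  # pow(..., -1, n): modular inverse
    return t % M

# Hypothetical coprime PRIs of 40, 41 and 43 ns, with a true ToF of
# 56 ns that folds once relative to the first two PRIs:
pris = [40, 41, 43]
residues = [56 % p for p in pris]      # [16, 15, 13]
```

Here crt_unfold(residues, pris) returns 56, regardless of how many folds occurred, as long as the true ToF is below the 70,520 ns product of the three PRIs.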
Fig. 5 is a block diagram schematically illustrating the operation of a camera 20 using multiple groups of transmitters, such as group 62 of transmitters 54 in fig. 3, according to an embodiment of the invention. The controller 26 sequentially drives the groups 62 such that each group sequentially transmits a corresponding pulse sequence at the two different PRIs. Alternatively, the principles of this embodiment may be extended to drive group 62 to transmit pulses at three or more different PRIs.
Fig. 5 illustrates the operation of the camera 20 over a single frame 80 (e.g., a period of 33 ms) of the depth-map generation. The frame 80 is divided into eight segments 82, 84, with two consecutive segments assigned to each of the four groups 62a, 62b, 62c, and 62d. During the first segment 82, the first group (e.g., group 62a) transmits a sequence of pulses at intervals given by PRI1, followed in segment 84 by a sequence of pulses at intervals given by PRI2. During each such segment, receiver 23 generates a ToF histogram for the pixels (or super-pixels) in SPAD array 24 that have received pulses reflected from points in the target scene 32. Thus, in each frame 80, controller 26 receives two ToF histograms with respect to the points in the scene 32 that are illuminated by group 62a. Controller 26 can then compare the two histograms to disambiguate any distance folds, as described above, and thus convert the ToF histograms into accurate depth coordinates.
This process of transmitting pulse sequences in PRI1 and PRI2 is repeated for each of the other groups 62b, 62c, and 62d, so the controller 26 receives the double histogram and extracts the depth coordinates of the corresponding pixel group in the receiver 23. The controller 26 combines the depth coordinates generated on all four groups of transmitters to create and output a complete depth map of the scene 32.
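The segmentation of the frame described above can be sketched as a simple schedule generator. The group labels, the representation of the schedule as (group, PRI, start-time) tuples, and the equal segment durations are illustrative assumptions.

```python
def frame_schedule(frame_ms=33.0, groups=("62a", "62b", "62c", "62d"),
                   pris=("PRI1", "PRI2")):
    """Divide one depth-map frame into equal segments, assigning each
    emitter group consecutive segments, one per PRI (eight segments
    for four groups and two PRIs, as in Fig. 5)."""
    seg_ms = frame_ms / (len(groups) * len(pris))
    return [(group, pri, i * seg_ms)
            for i, (group, pri) in enumerate(
                (g, p) for g in groups for p in pris)]
```

For the defaults, frame_schedule() yields eight segments of 4.125 ms each, beginning with group 62a at PRI1 and then at PRI2.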
Although fig. 5 illustrates the capture of a frame of data by collecting separate histograms at PRI1 and PRI2, in an alternative embodiment the ToF data may be collected in the same histogram, in a concurrent or alternating manner, with the histogram read out only once for each group (e.g., following block 84 in fig. 5). In this case the histogram will comprise peaks corresponding to the two PRI values. The use of two histogram readouts, as in the embodiment of fig. 5, provides more information, in the sense that the histogram data can be explicitly assigned to a particular PRI. Furthermore, due to the shorter exposure times, the ToF data are less affected by ambient noise, thereby improving the detectability of distance folding. On the other hand, the additional readout required in this case consumes time, and thus reduces the photon count relative to using a single readout for both PRIs.
Fig. 6 is a graph that schematically illustrates harmonic frequencies 92, 94 of a pair of pulse repetition intervals selected with reference to a cellular communication band 90, in accordance with an embodiment of the present invention. In this example, the frequency controller 14 has received an allocation of the frequency band 90 within the radio spectrum and has output to the camera 20 a list of allowed PRI values that will not generate any harmonics within the frequency band 90. The controller 26 selects the pulse repetition intervals PRI1 and PRI2 from this list of allowed values. Accordingly, the harmonic frequencies 92 and 94 fall entirely outside the frequency band 90, thereby minimizing any potential interference with the performance of the transceivers 12.
Fig. 7 is a flow chart schematically illustrating a method for selecting a pulse repetition interval for use in depth mapping, according to an embodiment of the present invention. For clarity and brevity, the method is described with reference to components of the apparatus 10 (FIG. 1). However, the principles of the method may be similarly applied in other devices that combine the functions of radio communication and pulse range measurement.
The frequency controller 14 identifies the radio frequency band over which the transceiver 12 is to communicate, at a frequency allocation step 100. Based on this allocation, the frequency controller calculates a list of PRIs without harmonics in the allocated radio frequency band, at a PRI list generation step 102. Alternatively, the frequency controller 14 may calculate and output a range of available PRI values. As a further alternative, the frequency controller 14 may communicate the allocation of radio bands to the camera 20, and the controller 26 may then itself calculate the list of available PRIs.
The controller 26 of the camera 20 selects a first PRI value (PRI 1) from the list in a first PRI selection step 104 and a second PRI value (PRI 2) in a second PRI selection step 106. Optionally, controller 26 may select one or more additional PRI values up to PRIk, at a further PRI selection step 107. The controller may apply any suitable optimization criteria in selecting the PRI value. For example, the controller 26 may select PRI1 and PRI2 to optimize SNR and maximize the range of depth coordinates that can be measured by the camera 20 while maintaining the resolution of the depth coordinates no greater than (i.e., no worse than) the predefined resolution limit. Criteria applicable in this regard include:
Optimizing the difference between the PRIs to maximize detectability and measurability at long range, while setting the lower limit of the PRIs according to the minimum range to be covered.
Optimizing compatibility with the radio transceivers 12, including:
Ensuring that the transceivers 12 are not disturbed.
Using a priori knowledge of cellular frequency channel variation (based, when possible, on the probability of channel usage in certain circumstances).
Choosing PRIs that are statistically compatible with a wide variety of wireless channels, in order to minimize potential reassignment of the PRIs.
Using PRIs that avoid, when possible, the need to switch internal synchronization circuitry within the device 10, in order to avoid acquisition delays and timing variations that may introduce transient depth errors.
Minimizing the step size whenever it is necessary to change the internal synchronization frequency, in order to minimize transient depth errors.
The above criteria are given by way of example only, and alternative optimization criteria may be used, depending on the system design and operating environment. A systematic method for selecting a set of PRI values that can be advantageously used together is described below with reference to fig. 8.
After selecting the PRI value, camera 20 captures depth data by sequentially transmitting a pulse sequence with PRI1 and PRI2, as described above, at a depth mapping step 108.
The camera 20 generally continues to operate at the selected pair of PRI values until the frequency controller 14 allocates a new frequency band for communication by the transceiver 12 at a new frequency allocation step 110. In this case, the method returns to step 100, where the frequency controller 14 modifies the allowed PRI range of the camera 20. The controller 26 will then select new values for one or both of PRI1 and PRI2 such that these new values fall within the modified range. The operation of the camera 20 continues to use these new values.
Fig. 8 is a flow chart schematically illustrating a method for selecting a pulse repetition interval for use in depth mapping according to another embodiment of the present invention. As previously mentioned, the principles of this particular method may be integrated into the more general method described above.
Specifically, the method of fig. 8 constructs sets of mutually compatible PRI values and enables the controller 26 to store a record of those sets. (the method may be performed by the controller 26 itself, but more typically the record may be generated offline by a general purpose computer and then stored in memory accessible to the controller.) at any given time, the controller 26 may then identify the operating environment of the mobile communication device 10, such as the geographical area in which the device is operating, and may select the PRI set to be applied in driving the radiation source 21 based on the characteristics of the operating environment. For example, if the controller 26 finds that certain frequency bands are often used for radio communications in a given environment, the controller may then select a set of PRIs to reduce the likelihood of interfering with those frequency bands. The controller may derive this information, for example, by analyzing the operation of transceiver 12 or based on information received from an external source.
In typical use, the mobile communication device 10 includes multiple transceivers operating in parallel in different frequency bands, such as a Global Positioning System (GPS) receiver operating in the range of 1.5 GHz-1.7 GHz; wireless local area network (Wi-Fi) transceivers operating on channels around 2.4 GHz and 5 GHz; and various cellular bands between 600 MHz and 5 GHz. In selecting the sets of PRI values, the controller 26 may give priority to certain frequencies, thereby avoiding PRIs with harmonics in high-priority radio frequency bands. For example, because GPS signals are weak and require a sensitive receiver, the GPS band will have a high priority. Priority may also be given to cellular channels used in critical signaling, as well as to the lower cellular frequency bands, which are more susceptible to interference. Wi-Fi channels may have a lower priority, as long as for any given set of PRI values there is at least one Wi-Fi channel that is free of interference.
In the method of fig. 8, the computer begins by compiling a list of all possible PRI values that can be supported by the camera 20 in a PRI compiling step 120. Then, at a PRI exclusion step 122, the computer excludes certain PRI values whose harmonics tend to cause electromagnetic interference (EMI) at certain target radio frequencies. For example, the computer may exclude:
PRI values with harmonics in the GPS band.
PRI values with harmonics that would interfere with critical cellular channels and/or high-priority cellular frequency bands.
PRI values with harmonics that would interfere with the entire frequency band of the Wi-Fi channels.
Starting from the list of PRI values remaining after step 122, the computer builds groups of mutually compatible PRI values, at a group building step 124. Each such group will include k members, where k ≥ 2. The controller 26 stores a record of these groups. During operation of the mobile communication device 10, the controller will then select one of the sets of PRI values to be used in the camera 20, for example on the basis of the operating environment and the radio frequencies actually used by the transceivers 12, as described above.
Various methods may be employed in constructing the sets of PRI values at step 124. In the present embodiment, for example, the computer begins with the largest PRI value remaining in the list, at a starting PRI selection step 130. The computer then searches for another, smaller PRI value that is compatible with the values already selected for inclusion in the group, at a further PRI selection step 132. The computer starts by searching for a PRI value that is close to a value already in the group, while ensuring that there is at least one Wi-Fi band with which the harmonics of none of the PRIs in the group will interfere. PRI values that do not meet this latter requirement are not selected at step 132. This process of adding and evaluating PRI values for incorporation in the present group continues iteratively until the group has k member PRI values, at a group completion step 134.
After a given group of k PRI values has been assembled, the computer returns to step 130 to construct the next group of PRI values. The process of building PRI groups continues until a sufficient number of groups have been built and stored, at a record completion step 136. For example, the computer may examine the harmonics of the PRIs in each group to ensure that, for each radio frequency band (including cellular and Wi-Fi bands) available to the transceivers 12, there is at least one group whose PRI values will not interfere with that band. The controller 26 will then be able to select an appropriate group of PRIs at step 126, to accommodate the actual radio frequencies in use at any given time.
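The greedy construction of steps 130-134 can be sketched as below. The compatibility predicate is a placeholder for the actual checks (e.g., leaving at least one Wi-Fi band free of harmonics from every PRI in the group); here a hypothetical minimum-spacing rule stands in for it, and the function names are illustrative.

```python
def build_groups(candidate_pris_ns, k, compatible, n_groups):
    """Greedily build up to n_groups sets of k mutually compatible PRI
    values. Each group starts from the largest remaining PRI and then
    adds smaller PRIs that are compatible with every member so far."""
    pool = sorted(candidate_pris_ns, reverse=True)
    groups = []
    while pool and len(groups) < n_groups:
        group = [pool.pop(0)]                    # largest remaining PRI
        for pri in list(pool):
            if all(compatible(pri, member) for member in group):
                group.append(pri)
                pool.remove(pri)
                if len(group) == k:
                    break
        if len(group) == k:                      # keep only complete groups
            groups.append(group)
    return groups

# Stand-in compatibility rule: PRIs at least 2 ns apart (hypothetical).
spacing_ok = lambda a, b: abs(a - b) >= 2
```

With candidate PRIs of 40-47 ns and k = 2, this yields the pairs (47, 45) and then (46, 44), each built from the largest PRI remaining in the pool.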
It should be understood that the embodiments described above are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art.
Claims (18)
1. A sensing device, comprising:
a radiation source configured to emit pulses of optical radiation towards a plurality of points in a target scene;
a receiver configured to receive the optical radiation reflected from the target scene and to output signals indicative of respective times of flight of the pulses to and from the points in the target scene in response to the received optical radiation;
processing and control circuitry configured to select a first PRI and a second PRI greater than the first PRI from an allowed range of pulse repetition intervals (PRIs), to drive the radiation source to emit a first sequence of the pulses at the first PRI and a second sequence of the pulses at the second PRI, and to process the signals output by the receiver in response to both the first and second sequences of the pulses so as to calculate respective depth coordinates of the points in the target scene; and
a radio transceiver that communicates over the air by transmitting and receiving signals in an allocated frequency band, wherein the processing and control circuitry is configured to identify the allowed range of PRIs in response to the allocated frequency band.
2. The apparatus of claim 1, wherein the radiation source comprises an array of vertical-cavity surface-emitting lasers (VCSELs).
3. The apparatus of claim 1, wherein the radiation source comprises an array of emitters arranged in a plurality of groups, and wherein the processing and control circuitry is configured to drive the groups sequentially, such that each group emits respective first and second sequences of the pulses at the first and second PRIs.
4. The apparatus of claim 1, wherein the receiver comprises a single-photon avalanche diode (SPAD).
5. The apparatus of claim 1, wherein the first PRI defines a range limit at which a time of flight of the pulse is equal to the first PRI, and wherein the processing and control circuitry is configured to compare the signals output by the receiver in response to the first and second sequences of pulses to distinguish points in the scene having respective depth coordinates less than the range limit from points in the scene having respective depth coordinates greater than the range limit to account for distance folding of the depth coordinates.
6. The apparatus of claim 5, wherein the processing and control circuitry is configured to calculate, for each of the points in the scene, respective first and second histograms of time of flight of the pulses in the first and second sequences, and to detect that a distance fold has occurred at a given point in response to a difference between the first and second histograms.
7. The apparatus of any of claims 1-6, wherein the allowed range is defined such that PRIs in the allowed range are free of harmonics within the allocated frequency band.
8. The apparatus of any of claims 1-6, wherein the processing and control circuitry is configured to modify the allowed range in response to a change in the allocated frequency band of the radio transceiver, and to select a new value for one or both of the first PRI and the second PRI such that the new value falls within the modified range.
9. The apparatus of any of claims 1-6, wherein the processing and control circuitry is configured to store a record of a plurality of groups of the PRI, identify an operating environment of the apparatus, and select one of the groups for application in driving the radiation source in response to the identified operating environment.
10. The apparatus of claim 9, wherein the processing and control circuitry is configured to select the one of the groups in response to a geographic region in which the apparatus is operating.
11. The apparatus of claim 9, wherein the groups of the PRI have respective priorities assigned in response to a likelihood of interfering with frequencies used by the radio transceiver, and wherein the processing and control circuitry is configured to select the one of the groups in response to the respective priorities.
12. The apparatus of claim 9, wherein the PRIs in each group are mutually compatible with the other PRIs in the group.
13. The apparatus of any of claims 1 to 6, wherein the processing and control circuitry is configured to select a third PRI greater than the second PRI from the allowed range of PRIs, to drive the radiation source to emit a third sequence of the pulses at the third PRI, and to process the signals output by the receiver in response to the first, second and third sequences of the pulses, so as to calculate the respective depth coordinates of the points in the target scene.
14. The apparatus of any of claims 1-6, wherein the processing and control circuitry is configured to select the first PRI and the second PRI to maximize the range of depth coordinates while maintaining a resolution of the depth coordinates no greater than a predefined resolution limit.
15. A method for sensing, comprising:
selecting a first PRI and a second PRI greater than the first PRI from an allowed range of pulse repetition intervals (PRIs),
Wherein selecting the first PRI and the second PRI comprises identifying an allowed range of PRIs in response to an allocated frequency band of a radio transceiver, the radio transceiver communicating over the air in the allocated frequency band in proximity to a radiation source, and selecting the first PRI and the second PRI from the allowed range;
driving the radiation source to emit a first sequence of pulses of optical radiation at the first PRI and a second sequence of the pulses of the optical radiation at the second PRI towards each of a plurality of points in a target scene;
receiving the optical radiation reflected from the target scene, and outputting signals indicative of respective times of flight of the pulses to and from the points in the target scene in response to the received optical radiation; and
processing the signals output in response to both the first and second sequences of the pulses so as to calculate respective depth coordinates of the points in the target scene.
16. The method of claim 15, wherein the first PRI defines a range limit at which a time of flight of the pulses is equal to the first PRI, and wherein processing the signals comprises comparing the signals output in response to the first and second sequences of pulses to distinguish points in the scene having respective depth coordinates less than the range limit from points in the scene having respective depth coordinates greater than the range limit to account for distance folding of the depth coordinates.
17. The method of claim 15 or 16, wherein the allowed range is defined such that the PRIs in the allowed range are free of harmonics within the allocated frequency band.
18. The method of claim 15 or 16, and comprising selecting a third PRI greater than the second PRI from the allowed range of PRIs, wherein driving the radiation source comprises directing a third sequence of the pulses at the third PRI toward the target scene, and wherein processing the signals comprises using the signals output in response to the first, second and third sequences of the pulses to calculate the respective depth coordinates of the points in the target scene.
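The dual-PRI range unfolding recited in claims 5, 6 and 16 can be illustrated numerically: a measured time of flight aliases modulo the PRI once the target lies beyond the range limit c·PRI/2, and comparing the aliases observed at two different PRIs recovers the true distance. This is a hedged sketch of the principle, not the patent's implementation; the fold-search bound `max_folds` and the tolerance are assumptions:

```python
C = 299792458.0  # speed of light, m/s

def folded_tof(distance_m, pri_s):
    """Apparent time of flight: the true round-trip time wraps modulo
    the PRI when the target is beyond the range limit c*PRI/2."""
    return (2.0 * distance_m / C) % pri_s

def unfold(t1, t2, pri1, pri2, max_folds=4, tol_m=1e-3):
    """Search small fold counts n1, n2 for a distance on which the
    measurements at both PRIs agree."""
    for n1 in range(max_folds):
        d1 = C * (t1 + n1 * pri1) / 2.0
        for n2 in range(max_folds):
            d2 = C * (t2 + n2 * pri2) / 2.0
            if abs(d1 - d2) < tol_m:
                return (d1 + d2) / 2.0
    return None
```

With pri1 = 100 ns (range limit about 15 m) and pri2 = 130 ns, a target at 20 m appears at roughly 33.4 ns and 3.4 ns respectively; adding one fold of each PRI reconciles the two readings at 20 m, which is the kind of disagreement between the two histograms that signals a distance fold.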
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201962859211P | 2019-06-10 | 2019-06-10 | |
US62/859,211 | 2019-06-10 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112068149A CN112068149A (en) | 2020-12-11 |
CN112068149B true CN112068149B (en) | 2024-05-10 |
Family
ID=70975725
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010521767.4A Active CN112068149B (en) | 2019-06-10 | 2020-06-10 | Sensing device and method for sensing |
Country Status (5)
Country | Link |
---|---|
US (1) | US11500094B2 (en) |
EP (2) | EP3751307B1 (en) |
KR (2) | KR102433815B1 (en) |
CN (1) | CN112068149B (en) |
TW (1) | TWI745998B (en) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10830879B2 (en) | 2017-06-29 | 2020-11-10 | Apple Inc. | Time-of-flight depth mapping with parallax compensation |
US11774563B2 (en) * | 2017-12-04 | 2023-10-03 | Ams International Ag | Time-of-flight module and method to determine distance to object |
EP3704510B1 (en) | 2017-12-18 | 2022-10-05 | Apple Inc. | Time-of-flight sensing using an addressable array of emitters |
US11587247B1 (en) | 2019-04-03 | 2023-02-21 | Meta Platforms Technologies, Llc | Synchronous event driven readout of pixels in a detector for direct time-of-flight depth sensing |
US11500094B2 (en) * | 2019-06-10 | 2022-11-15 | Apple Inc. | Selection of pulse repetition intervals for sensing time of flight |
US11733359B2 (en) | 2019-12-03 | 2023-08-22 | Apple Inc. | Configurable array of single-photon detectors |
US11480684B2 (en) | 2020-06-18 | 2022-10-25 | Meta Platforms Technologies, Llc | Time of flight depth system including an illumination source with addressable illumination blocks |
US11747438B1 (en) * | 2021-01-07 | 2023-09-05 | Bae Systems Information And Electronic Systems Integration Inc. | Cognitive electronic warfare scheduler |
CN113406637B (en) * | 2021-06-23 | 2022-11-01 | 电子科技大学 | Joint iterative tomography method based on dual-frequency narrow-band signals |
WO2023276222A1 (en) * | 2021-06-29 | 2023-01-05 | ソニーセミコンダクタソリューションズ株式会社 | Light detection device, light detection system, and light detection method |
US11681028B2 (en) | 2021-07-18 | 2023-06-20 | Apple Inc. | Close-range measurement of time of flight using parallax shift |
CN117368854A (en) * | 2022-06-30 | 2024-01-09 | 华为技术有限公司 | Multi-target detection device, detection method and electronic equipment |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107438774A (en) * | 2015-04-20 | 2017-12-05 | 瑞思迈传感器技术有限公司 | Multisensor radio frequency detects |
EP3285087A1 (en) * | 2016-08-19 | 2018-02-21 | ams AG | Sensor arrangement and method for determining time-of-flight |
CN109791202A (en) * | 2016-09-22 | 2019-05-21 | 苹果公司 | Laser radar with irregular pulse train |
CN109791195A (en) * | 2016-09-22 | 2019-05-21 | 苹果公司 | The adaptive transmission power control reached for light |
Family Cites Families (213)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6384903B1 (en) | 1977-02-28 | 2002-05-07 | Bae Systems Information And Electronic Systems Integration, Inc. | Range gated remote measurement utilizing two-photon absorption |
US4623237A (en) | 1984-07-07 | 1986-11-18 | Canon Kabushiki Kaisha | Automatic focusing device |
US4757200A (en) | 1986-07-28 | 1988-07-12 | Visidyne, Inc. | Pulsed radiation detection system |
GB8902080D0 (en) | 1989-01-31 | 1989-03-22 | York Ltd | Optical detection apparatus for counting optical photons |
JPH02287113A (en) | 1989-04-27 | 1990-11-27 | Asahi Optical Co Ltd | Distance measuring instrument |
US5373148A (en) | 1989-10-30 | 1994-12-13 | Symbol Technologies, Inc. | Optical scanners with scan motion damping and orientation of astigmantic laser generator to optimize reading of two-dimensionally coded indicia |
US5164823A (en) | 1990-12-21 | 1992-11-17 | Kaman Aerospace Corporation | Imaging lidar system employing multipulse single and multiple gating for single and stacked frames |
JPH0567195A (en) | 1991-09-05 | 1993-03-19 | Matsushita Electric Ind Co Ltd | Shape measuring instrument |
US5270780A (en) | 1991-09-13 | 1993-12-14 | Science Applications International Corporation | Dual detector lidar system and method |
CA2605339C (en) * | 1993-04-12 | 2008-09-30 | The Regents Of The University Of California | Ultra-wideband radar motion sensor |
JP3240835B2 (en) | 1994-06-09 | 2001-12-25 | 株式会社日立製作所 | Vehicle distance measuring device |
JPH09197045A (en) | 1996-01-24 | 1997-07-31 | Nissan Motor Co Ltd | Radar device for vehicles |
JPH10170637A (en) | 1996-12-16 | 1998-06-26 | Omron Corp | Light scanner |
JPH1163920A (en) | 1997-08-26 | 1999-03-05 | Matsushita Electric Works Ltd | Optically scanning system displacement measuring equipment |
JP3832101B2 (en) | 1998-08-05 | 2006-10-11 | 株式会社デンソー | Distance measuring device |
IT1316793B1 (en) | 2000-03-09 | 2003-05-12 | Milano Politecnico | Monolithic active-quenching and active-reset circuit for avalanche photodiodes |
JP4595197B2 (en) | 2000-12-12 | 2010-12-08 | 株式会社デンソー | Distance measuring device |
US6771103B2 (en) | 2001-03-14 | 2004-08-03 | Denso Corporation | Time measurement apparatus, distance measurement apparatus, and clock signal generating apparatus usable therein |
US6486827B2 (en) | 2001-04-18 | 2002-11-26 | Raytheon Company | Sparse frequency waveform radar system and method |
JP4457525B2 (en) | 2001-06-11 | 2010-04-28 | 株式会社デンソー | Distance measuring device |
US7187445B2 (en) | 2001-07-19 | 2007-03-06 | Automotive Distance Control Systems Gmbh | Method and apparatus for optically scanning a scene |
US7126218B1 (en) | 2001-08-07 | 2006-10-24 | Amkor Technology, Inc. | Embedded heat spreader ball grid array |
US7899339B2 (en) | 2002-07-30 | 2011-03-01 | Amplification Technologies Inc. | High-sensitivity, high-resolution detector devices and arrays |
US7312856B2 (en) | 2002-09-12 | 2007-12-25 | Lockheed Martin Corporation | Programmable pulse capture device with automatic gain control |
US20060106317A1 (en) | 2002-09-16 | 2006-05-18 | Joule Microsystems Canada Inc. | Optical system and use thereof for detecting patterns in biological tissue |
GB2395261A (en) | 2002-11-11 | 2004-05-19 | Qinetiq Ltd | Ranging apparatus |
AU2003295944A1 (en) | 2002-11-26 | 2005-02-04 | James F. Munro | An apparatus for high accuracy distance and velocity measurement and methods thereof |
US7428997B2 (en) | 2003-07-29 | 2008-09-30 | Microvision, Inc. | Method and apparatus for illuminating a field-of-view and capturing an image |
DE10361870B4 (en) | 2003-12-29 | 2006-05-04 | Faro Technologies Inc., Lake Mary | Laser scanner and method for optically scanning and measuring an environment of the laser scanner |
DE102005027208B4 (en) | 2004-11-16 | 2011-11-10 | Zoller & Fröhlich GmbH | Method for controlling a laser scanner |
EP1842082A2 (en) | 2005-01-20 | 2007-10-10 | Elbit Systems Electro-Optics Elop Ltd. | Laser obstacle detection and display |
US9002511B1 (en) | 2005-10-21 | 2015-04-07 | Irobot Corporation | Methods and systems for obstacle detection using structured light |
US7812301B2 (en) | 2005-10-28 | 2010-10-12 | Sony Corporation | Solid-state imaging device, method of driving solid-state imaging device and imaging apparatus |
US7303005B2 (en) | 2005-11-04 | 2007-12-04 | Graftech International Holdings Inc. | Heat spreaders with vias |
US8355117B2 (en) | 2005-12-21 | 2013-01-15 | Ecole Polytechnique Federale De Lausanne | Method and arrangement for measuring the distance to an object |
DE102006013290A1 (en) | 2006-03-23 | 2007-09-27 | Robert Bosch Gmbh | Device for optical distance measurement and method for operating such a device |
US7405812B1 (en) | 2006-05-18 | 2008-07-29 | Canesta, Inc. | Method and system to avoid inter-system interference for phase-based time-of-flight systems |
EP1860462A1 (en) | 2006-05-23 | 2007-11-28 | Leica Geosystems AG | Distance measuring method and distance meter for determining the spatial dimension of a target |
WO2008008970A2 (en) | 2006-07-13 | 2008-01-17 | Velodyne Acoustics, Inc | High definition lidar system |
WO2008113067A2 (en) | 2007-03-15 | 2008-09-18 | Johns Hopkins University | Deep submicron and nano cmos single photon photodetector pixel with event based circuits for readout data-rate reduction |
US7652252B1 (en) | 2007-10-08 | 2010-01-26 | Hrl Laboratories, Llc | Electronically tunable and reconfigurable hyperspectral photon detector |
US8332134B2 (en) | 2008-04-24 | 2012-12-11 | GM Global Technology Operations LLC | Three-dimensional LIDAR-based clear path detection |
EP2384182A4 (en) | 2008-04-30 | 2016-10-19 | Univ Texas | An apparatus and method for noninvasive evaluation of a target versus a non-target |
US20090273770A1 (en) | 2008-04-30 | 2009-11-05 | Honeywell International Inc. | Systems and methods for safe laser imaging, detection and ranging (lidar) operation |
DE102008031681A1 (en) | 2008-07-04 | 2010-01-14 | Eads Deutschland Gmbh | LIDAR method for measuring velocities and LIDAR device with timed detection |
US8026471B2 (en) | 2008-07-23 | 2011-09-27 | Princeton Lightwave, Inc. | Single-photon avalanche detector-based focal plane array |
JP5585903B2 (en) | 2008-07-30 | 2014-09-10 | 国立大学法人静岡大学 | Distance image sensor and method for generating imaging signal by time-of-flight method |
IL200332A0 (en) | 2008-08-19 | 2010-04-29 | Rosemount Aerospace Inc | Lidar system using a pseudo-random pulse sequence |
US9554770B2 (en) | 2008-09-29 | 2017-01-31 | Siemens Medical Solutions Usa, Inc. | High pulse repetition frequency for detection of tissue mechanical property with ultrasound |
US20100096459A1 (en) | 2008-10-16 | 2010-04-22 | Vladimir Gurevich | Electro-optical reader with extended working range |
IT1392366B1 (en) | 2008-12-17 | 2012-02-28 | St Microelectronics Rousset | Geiger-mode photodiode with integrated and controllable quenching resistor, photodiode array, and related fabrication process |
US8447563B2 (en) | 2009-03-31 | 2013-05-21 | The United States Of America As Represented By The Secretary Of The Navy | Method and system for determination of detection probability or a target object based on a range |
WO2010141631A1 (en) | 2009-06-02 | 2010-12-09 | Velodyne Acoustics, Inc. | Color lidar scanner |
CN101923173B (en) | 2009-06-10 | 2014-10-01 | 圣戈本陶瓷及塑料股份有限公司 | Scintillator and detector assembly |
US9417326B2 (en) | 2009-06-22 | 2016-08-16 | Toyota Motor Europe Nv/Sa | Pulsed light optical rangefinder |
US8319170B2 (en) | 2009-07-10 | 2012-11-27 | Motorola Mobility Llc | Method for adapting a pulse power mode of a proximity sensor |
DE102009029372A1 (en) | 2009-09-11 | 2011-03-24 | Robert Bosch Gmbh | Measuring device for measuring a distance between the measuring device and a target object by means of optical measuring radiation |
DE102009045323A1 (en) | 2009-10-05 | 2011-04-07 | Robert Bosch Gmbh | Optical distance measuring device with calibration device |
JP2011089874A (en) | 2009-10-22 | 2011-05-06 | Toyota Central R&D Labs Inc | Distance image data acquisition device |
US8390791B2 (en) | 2009-11-30 | 2013-03-05 | General Electric Company | Light detection and ranging system |
US20110187878A1 (en) | 2010-02-02 | 2011-08-04 | Primesense Ltd. | Synchronization of projected illumination with rolling shutter of image sensor |
WO2011112633A1 (en) | 2010-03-09 | 2011-09-15 | Flir Systems, Inc. | Imager with multiple sensor arrays |
DE102010003843A1 (en) | 2010-04-12 | 2011-10-13 | Robert Bosch Gmbh | Distance measuring device with homogenizing measurement evaluation |
JP5633181B2 (en) | 2010-05-07 | 2014-12-03 | 株式会社ニコン | Depth map output device |
LU91688B1 (en) | 2010-05-17 | 2011-11-18 | Iee Sarl | Scanning 3D imager |
US8687174B2 (en) | 2010-08-11 | 2014-04-01 | Samsung Electronics Co., Ltd. | Unit pixel, photo-detection device and method of measuring a distance using the same |
US8736818B2 (en) | 2010-08-16 | 2014-05-27 | Ball Aerospace & Technologies Corp. | Electronically steered flash LIDAR |
US8836250B2 (en) | 2010-10-01 | 2014-09-16 | Accuray Incorporated | Systems and methods for cargo scanning and radiotherapy using a traveling wave linear accelerator based x-ray source using current to modulate pulse-to-pulse dosage |
GB2486165A (en) | 2010-11-30 | 2012-06-13 | St Microelectronics Res & Dev | Oven using a Single Photon Avalanche Diode (SPAD) array |
GB2485995B (en) | 2010-11-30 | 2014-01-01 | St Microelectronics Res & Dev | Improved proximity sensor and associated method, computer readable medium and firmware |
US8803952B2 (en) | 2010-12-20 | 2014-08-12 | Microsoft Corporation | Plural detector time-of-flight depth mapping |
AT510296B1 (en) | 2010-12-21 | 2012-03-15 | Riegl Laser Measurement Sys | METHOD OF REMOTE MEASUREMENT BY MEANS OF LASER IMPULSES |
EP2469301A1 (en) | 2010-12-23 | 2012-06-27 | André Borowski | Methods and devices for generating a representation of a 3D scene at very high speed |
EP2477043A1 (en) | 2011-01-12 | 2012-07-18 | Sony Corporation | 3D time-of-flight camera and method |
JP5834416B2 (en) | 2011-02-01 | 2015-12-24 | セイコーエプソン株式会社 | Image forming apparatus |
KR101318951B1 (en) | 2011-02-28 | 2013-10-17 | 한국과학기술원 | Scanning three-dimensional imaging pulsed laser radar System and Method using dual Geiger-mode avalanche photodiodes |
DE102011005746A1 (en) | 2011-03-18 | 2012-09-20 | Robert Bosch Gmbh | Measuring device for multi-dimensional measurement of a target object |
WO2012137109A2 (en) | 2011-04-05 | 2012-10-11 | Koninklijke Philips Electronics N.V. | Detector array with time-to-digital conversion having improved temporal accuracy |
EP2702761A4 (en) | 2011-04-25 | 2014-11-26 | Generic Imaging Ltd | System and method for linearization of multi-camera flat panel x-ray detectors |
US9137463B2 (en) | 2011-05-12 | 2015-09-15 | Microsoft Technology Licensing, Llc | Adaptive high dynamic range camera |
EP2535755A1 (en) | 2011-06-14 | 2012-12-19 | Ecole Polytechnique Fédérale de Lausanne (EPFL) | Cumulant microscopy |
US11231502B2 (en) | 2011-06-30 | 2022-01-25 | The Regents Of The University Of Colorado | Remote measurement of shallow depths in semi-transparent media |
DE102011107645A1 (en) | 2011-07-12 | 2013-01-17 | Leica Microsystems Cms Gmbh | Apparatus and method for detecting light |
DE102011081561A1 (en) | 2011-08-25 | 2013-02-28 | Ifm Electronic Gmbh | Time of flight camera system with signal path monitoring |
WO2013028691A1 (en) | 2011-08-25 | 2013-02-28 | Georgia Tech Research Corporation | Gas sensors and methods of preparation thereof |
US9538987B2 (en) | 2011-09-28 | 2017-01-10 | General Electric Company | System and method for ultrasound imaging |
US9341464B2 (en) | 2011-10-17 | 2016-05-17 | Atlas5D, Inc. | Method and apparatus for sizing and fitting an individual for apparel, accessories, or prosthetics |
US20130092846A1 (en) | 2011-10-18 | 2013-04-18 | Uwm Research Foundation, Inc. | Fiber-optic sensors for real-time monitoring |
JP2013113669A (en) | 2011-11-28 | 2013-06-10 | Mitsubishi Electric Corp | Laser radar device |
US9024246B2 (en) | 2011-12-19 | 2015-05-05 | Princeton Lightwave, Inc. | Two-state negative feedback avalanche diode having a control element for determining load state |
FR2984522B1 (en) | 2011-12-20 | 2014-02-14 | St Microelectronics Grenoble 2 | DEVICE FOR DETECTING THE PROXIMITY OF AN OBJECT, COMPRISING SPAD PHOTODIODS |
US9052356B2 (en) | 2012-02-15 | 2015-06-09 | International Business Machines Corporation | Embedded photon emission calibration (EPEC) |
US8989596B2 (en) | 2012-03-06 | 2015-03-24 | Northrop Grumman Systems Corporation | Multiple sensor optical communication systems and methods |
US9109888B2 (en) | 2012-03-21 | 2015-08-18 | Honda Motor Co., Ltd. | Distance measuring system |
US9335220B2 (en) | 2012-03-22 | 2016-05-10 | Apple Inc. | Calibration of time-of-flight measurement using stray reflections |
US20130258099A1 (en) | 2012-03-29 | 2013-10-03 | Samsung Electronics Co., Ltd. | Depth Estimation Device And Operating Method Using The Depth Estimation Device |
US9723233B2 (en) | 2012-04-18 | 2017-08-01 | Brightway Vision Ltd. | Controllable gated sensor |
KR102038533B1 (en) | 2012-06-14 | 2019-10-31 | 한국전자통신연구원 | Laser Radar System and Method for Acquiring Target Image |
US20130342835A1 (en) | 2012-06-25 | 2013-12-26 | California Institute Of Technology | Time resolved laser raman spectroscopy using a single photon avalanche diode array |
GB2504291A (en) | 2012-07-24 | 2014-01-29 | St Microelectronics Ltd | A proximity and gesture detection module |
EP2708914A1 (en) | 2012-09-18 | 2014-03-19 | Sick Ag | Optoelectronic sensor and method for recording a depth map |
KR102010807B1 (en) * | 2012-12-06 | 2019-08-14 | 삼성전자 주식회사 | Information exchange method and apparatus for d2d communication |
GB201300334D0 (en) | 2013-01-09 | 2013-02-20 | St Microelectronics Ltd | Sensor circuit |
GB2510890A (en) | 2013-02-18 | 2014-08-20 | St Microelectronics Res & Dev | Method and apparatus |
KR102048361B1 (en) | 2013-02-28 | 2019-11-25 | 엘지전자 주식회사 | Distance detecting device and Image processing apparatus including the same |
DE202013101039U1 (en) | 2013-03-11 | 2014-03-12 | Sick Ag | Optoelectronic sensor for distance measurement |
US9182278B2 (en) | 2013-03-14 | 2015-11-10 | Sciaps, Inc. | Wide spectral range spectrometer |
US9516248B2 (en) | 2013-03-15 | 2016-12-06 | Microsoft Technology Licensing, Llc | Photosensor having enhanced sensitivity |
US9076707B2 (en) | 2013-04-19 | 2015-07-07 | Lightspin Technologies, Inc. | Integrated avalanche photodiode arrays |
GB2513408B (en) | 2013-04-26 | 2017-12-13 | Toshiba Res Europe Limited | A photon detector and a photon detection method |
US10269104B2 (en) | 2013-04-29 | 2019-04-23 | Nokia Technologies Oy | Method and apparatus for fusing distance data from a distance sensing camera with an image |
GB2514576A (en) | 2013-05-29 | 2014-12-03 | St Microelectronics Res & Dev | Methods and apparatus |
US20150260830A1 (en) | 2013-07-12 | 2015-09-17 | Princeton Optronics Inc. | 2-D Planar VCSEL Source for 3-D Imaging |
US9268012B2 (en) | 2013-07-12 | 2016-02-23 | Princeton Optronics Inc. | 2-D planar VCSEL source for 3-D imaging |
GB2520232A (en) | 2013-08-06 | 2015-05-20 | Univ Edinburgh | Multiple Event Time to Digital Converter |
US10061028B2 (en) | 2013-09-05 | 2018-08-28 | Texas Instruments Incorporated | Time-of-flight (TOF) assisted structured light imaging |
US8925814B1 (en) | 2013-10-09 | 2015-01-06 | Symbol Technologies, Inc. | Apparatus for and method of monitoring output power of a laser beam during reading of targets |
US9443310B2 (en) | 2013-10-09 | 2016-09-13 | Microsoft Technology Licensing, Llc | Illumination modules that emit structured light |
US10063844B2 (en) | 2013-10-17 | 2018-08-28 | Microsoft Technology Licensing, Llc. | Determining distances by probabilistic time of flight imaging |
US10203399B2 (en) | 2013-11-12 | 2019-02-12 | Big Sky Financial Corporation | Methods and apparatus for array based LiDAR systems with reduced interference |
JP6489320B2 (en) | 2013-11-20 | 2019-03-27 | パナソニックIpマネジメント株式会社 | Ranging imaging system |
US9210350B2 (en) | 2013-12-09 | 2015-12-08 | Omnivision Technologies, Inc. | Low power imaging system with single photon avalanche diode photon counters and ghost image reduction |
US9625580B2 (en) | 2014-01-03 | 2017-04-18 | Princeton Lightwave, Inc. | LiDAR system comprising a single-photon detector |
US9331116B2 (en) | 2014-01-15 | 2016-05-03 | Omnivision Technologies, Inc. | Back side illuminated single photon avalanche diode imaging sensor with high short wavelength detection efficiency |
US9312401B2 (en) | 2014-01-15 | 2016-04-12 | Omnivision Technologies, Inc. | Single photon avalanche diode imaging sensor for complementary metal oxide semiconductor stacked chip applications |
JP6207407B2 (en) | 2014-01-17 | 2017-10-04 | オムロンオートモーティブエレクトロニクス株式会社 | Laser radar apparatus, object detection method, and program |
US9456201B2 (en) | 2014-02-10 | 2016-09-27 | Microsoft Technology Licensing, Llc | VCSEL array for a depth camera |
CN103763485A (en) | 2014-02-17 | 2014-04-30 | 苏州超锐微电子有限公司 | Single-photon level resolution ratio image capturing chip front-end circuit module for intelligent image sensor |
CN106104297B (en) | 2014-03-14 | 2020-06-30 | 赫普塔冈微光有限公司 | Optoelectronic module operable to identify spurious reflections and compensate for errors caused by spurious reflections |
US9761049B2 (en) * | 2014-03-28 | 2017-09-12 | Intel Corporation | Determination of mobile display position and orientation using micropower impulse radar |
US9952323B2 (en) | 2014-04-07 | 2018-04-24 | Samsung Electronics Co., Ltd. | High resolution, high frame rate, low power image sensor |
US10419703B2 (en) | 2014-06-20 | 2019-09-17 | Qualcomm Incorporated | Automatic multiple depth cameras synchronization using time sharing |
CN106662433B (en) | 2014-06-27 | 2019-09-06 | 新加坡恒立私人有限公司 | Structured light imaging system and method |
DE202014005508U1 (en) | 2014-07-02 | 2014-10-09 | Robert Bosch Gmbh | Distance measuring device |
EP3171241A4 (en) | 2014-07-16 | 2017-12-13 | Ricoh Company, Ltd. | System, machine, control method, and program |
US9377533B2 (en) | 2014-08-11 | 2016-06-28 | Gerard Dirk Smits | Three-dimensional triangulation and time-of-flight based tracking systems and methods |
US9810777B2 (en) | 2014-08-22 | 2017-11-07 | Voxtel, Inc. | Asynchronous LADAR and imaging array |
US20160072258A1 (en) | 2014-09-10 | 2016-03-10 | Princeton Optronics Inc. | High Resolution Structured Light Source |
US9596440B2 (en) | 2014-09-11 | 2017-03-14 | Microvision, Inc. | Scanning laser planarity detection |
US10295658B2 (en) | 2014-10-02 | 2019-05-21 | The Johns Hopkins University | Optical detection system |
TWI679442B (en) | 2014-12-02 | 2019-12-11 | 新加坡商新加坡恒立私人有限公司 | Depth sensor module and depth sensing method |
US10036801B2 (en) | 2015-03-05 | 2018-07-31 | Big Sky Financial Corporation | Methods and apparatus for increased precision and improved range in a multiple detector LiDAR array |
CN104730535A (en) | 2015-03-20 | 2015-06-24 | 武汉科技大学 | Vehicle-mounted Doppler laser radar distance measuring method |
US10088557B2 (en) | 2015-03-20 | 2018-10-02 | MSOTEK Co., Ltd | LIDAR apparatus |
FR3034204A1 (en) | 2015-03-23 | 2016-09-30 | Stmicroelectronics (Grenoble 2) Sas | |
US10132616B2 (en) | 2015-04-20 | 2018-11-20 | Samsung Electronics Co., Ltd. | CMOS image sensor for 2D imaging and depth measurement with ambient light rejection |
US9864048B2 (en) | 2015-05-17 | 2018-01-09 | Microsoft Technology Licensing, Llc. | Gated time of flight camera |
US10488549B2 (en) | 2015-05-18 | 2019-11-26 | Lasermotive, Inc. | Locating power receivers |
CN104914446B (en) | 2015-06-19 | 2017-06-27 | 南京理工大学 | Three-dimensional distance image time domain real-time de-noising method based on photon counting |
EP3113478A1 (en) | 2015-06-30 | 2017-01-04 | Thomson Licensing | Plenoptic foveated camera |
US10620300B2 (en) | 2015-08-20 | 2020-04-14 | Apple Inc. | SPAD array with gated histogram construction |
US9989357B2 (en) | 2015-09-09 | 2018-06-05 | Faro Technologies, Inc. | Aerial device that cooperates with an external projector to measure three-dimensional coordinates |
US10063849B2 (en) | 2015-09-24 | 2018-08-28 | Ouster, Inc. | Optical system for collecting distance information within a field |
CN108604053B (en) | 2015-10-21 | 2021-11-02 | 普林斯顿光电子股份有限公司 | Coding pattern projector |
US10067224B2 (en) | 2015-10-22 | 2018-09-04 | Stmicroelectronics (Research & Development) Limited | Time to digital converter (TDC) with synchronous output and related methods |
EP3159711A1 (en) | 2015-10-23 | 2017-04-26 | Xenomatix NV | System and method for determining a distance to an object |
JP2018533026A (en) | 2015-11-05 | 2018-11-08 | ルミナー テクノロジーズ インコーポレイテッド | Lidar system with improved scan speed for creating high resolution depth maps |
FR3043797A1 (en) | 2015-11-16 | 2017-05-19 | Stmicroelectronics (Grenoble 2) Sas | |
CN108603758A (en) | 2015-11-30 | 2018-09-28 | 卢米诺技术公司 | The pulse laser of laser radar system and laser radar system with distribution type laser device and multiple sensor heads |
EP3182156B1 (en) | 2015-12-18 | 2021-01-27 | STMicroelectronics (Research & Development) Limited | Ranging apparatus |
JP6854828B2 (en) | 2015-12-18 | 2021-04-07 | Gerard Dirk Smits | Real-time position detection of an object |
US9997551B2 (en) | 2015-12-20 | 2018-06-12 | Apple Inc. | Spad array with pixel-level bias control |
US10324171B2 (en) | 2015-12-20 | 2019-06-18 | Apple Inc. | Light detection and ranging sensor |
EP3185037B1 (en) | 2015-12-23 | 2020-07-08 | STMicroelectronics (Research & Development) Limited | Depth imaging system |
EP3185038B1 (en) | 2015-12-23 | 2018-02-14 | Sick Ag | Optoelectronic sensor and method for measuring a distance |
US9823118B2 (en) | 2015-12-26 | 2017-11-21 | Intel Corporation | Low power, high resolution solid state LIDAR circuit |
US10386487B1 (en) | 2015-12-30 | 2019-08-20 | Argo AI, LLC | Geiger-mode LiDAR system having improved signal-to-noise ratio |
US10627490B2 (en) | 2016-01-31 | 2020-04-21 | Velodyne Lidar, Inc. | Multiple pulse, LIDAR based 3-D imaging |
US10754015B2 (en) | 2016-02-18 | 2020-08-25 | Aeye, Inc. | Adaptive ladar receiver |
US9933513B2 (en) | 2016-02-18 | 2018-04-03 | Aeye, Inc. | Method and apparatus for an adaptive ladar receiver |
US9866816B2 (en) | 2016-03-03 | 2018-01-09 | 4D Intellectual Properties, Llc | Methods and apparatus for an active pulsed 4D camera for image acquisition and analysis |
JP2019512704A (en) | 2016-03-21 | 2019-05-16 | Velodyne Lidar, Inc. | Three-dimensional imaging based on LIDAR with variable pulse repetition rate |
US9739881B1 (en) | 2016-03-24 | 2017-08-22 | RFNAV, Inc. | Low cost 3D radar imaging and 3D association method from low count linear arrays for all weather autonomous vehicle navigation |
JP2017195573A (en) | 2016-04-22 | 2017-10-26 | Sony Corporation | Imaging apparatus and electronic apparatus |
CA3022417C (en) | 2016-04-26 | 2022-10-18 | Illinois Institute Of Technology | Apparatus and method for enhanced early photon detection in optical projection tomography |
US10690756B2 (en) | 2016-05-10 | 2020-06-23 | Texas Instruments Incorporated | Methods and apparatus for LIDAR operation with pulse position modulation |
JP6780308B2 (en) | 2016-06-10 | 2020-11-04 | Ricoh Co., Ltd. | Object detection device, sensing device and mobile device |
US10823826B2 (en) | 2016-06-14 | 2020-11-03 | Stmicroelectronics, Inc. | Adaptive laser power and ranging limit for time of flight sensor |
US20180341009A1 (en) | 2016-06-23 | 2018-11-29 | Apple Inc. | Multi-range time of flight sensing |
US10890649B2 (en) | 2016-08-11 | 2021-01-12 | Qualcomm Incorporated | System and method for measuring reference and returned light beams in an optical system |
EP3497477A1 (en) | 2016-08-12 | 2019-06-19 | Fastree3D SA | Method and device for measuring a distance to a target in a multi-user environment by means of at least one detector |
US20180059220A1 (en) | 2016-08-30 | 2018-03-01 | Qualcomm Incorporated | Laser ranging device with beam signature and signature recognition |
US10305247B2 (en) | 2016-08-30 | 2019-05-28 | Apple Inc. | Radiation source with a small-angle scanning array |
US10291895B2 (en) | 2016-10-25 | 2019-05-14 | Omnivision Technologies, Inc. | Time of flight photosensor |
DE102016221049A1 (en) | 2016-10-26 | 2018-04-26 | Robert Bosch Gmbh | Apparatus and method for receiving a reflected light pulse in a lidar system |
CN106405572B (en) | 2016-11-10 | 2019-02-26 | Xi'an Jiaotong University | Long-range high-resolution laser active imaging device and method based on spatial encoding |
GB201622429D0 (en) | 2016-12-30 | 2017-02-15 | Univ Court Of The Univ Of Edinburgh The | Photon sensor apparatus |
US10154254B2 (en) | 2017-01-17 | 2018-12-11 | Facebook Technologies, Llc | Time-of-flight depth sensing for eye tracking |
EP3355133B1 (en) | 2017-01-25 | 2019-10-30 | ams AG | Method for calibrating a time-to-digital converter system and time-to-digital converter system |
CN110235024B (en) * | 2017-01-25 | 2022-10-28 | Apple Inc. | SPAD detector with modulation sensitivity |
US11105925B2 (en) | 2017-03-01 | 2021-08-31 | Ouster, Inc. | Accurate photo detector measurements for LIDAR |
CN114114209A (en) | 2017-03-01 | 2022-03-01 | Ouster, Inc. | Accurate photodetector measurement for LIDAR |
US10830879B2 (en) | 2017-06-29 | 2020-11-10 | Apple Inc. | Time-of-flight depth mapping with parallax compensation |
SG11201913642VA (en) | 2017-07-05 | 2020-01-30 | Ouster Inc | Light ranging device with electronically scanned emitter array and synchronized sensor array |
EP3428683B1 (en) | 2017-07-11 | 2019-08-28 | Sick Ag | Optoelectronic sensor and method for measuring a distance |
EP3428574A1 (en) | 2017-07-11 | 2019-01-16 | Fondazione Bruno Kessler | Device for measuring a distance and method for measuring said distance |
US20190018119A1 (en) * | 2017-07-13 | 2019-01-17 | Apple Inc. | Early-late pulse counting for light emitting depth sensors |
US10955552B2 (en) | 2017-09-27 | 2021-03-23 | Apple Inc. | Waveform design for a LiDAR system with closely-spaced pulses |
TWI661211B (en) * | 2017-12-08 | 2019-06-01 | Industrial Technology Research Institute | Ranging device and method thereof |
EP3521856B1 (en) | 2018-01-31 | 2023-09-13 | ams AG | Time-of-flight arrangement and method for a time-of-flight measurement |
US10996323B2 (en) | 2018-02-22 | 2021-05-04 | Stmicroelectronics (Research & Development) Limited | Time-of-flight imaging device, system and method |
DE102018203534A1 (en) | 2018-03-08 | 2019-09-12 | Ibeo Automotive Systems GmbH | Receiver arrangement for receiving light pulses, LiDAR module and method for receiving light pulses |
US10158038B1 (en) | 2018-05-17 | 2018-12-18 | Hi Llc | Fast-gated photodetector architectures comprising dual voltage sources with a switch configuration |
US11543495B2 (en) | 2018-11-01 | 2023-01-03 | Waymo Llc | Shot reordering in LIDAR systems |
WO2020101576A1 (en) | 2018-11-16 | 2020-05-22 | Ams Sensors Singapore Pte. Ltd. | Depth sensing using optical time-of-flight techniques through a transmissive cover |
DE102018220688A1 (en) | 2018-11-30 | 2020-06-04 | Ibeo Automotive Systems GmbH | Analog-to-digital converter |
CN113748356A (en) | 2019-01-18 | 2021-12-03 | Sense Photonics, Inc. | Digital pixel and operation method thereof |
CN113330328A (en) | 2019-02-11 | 2021-08-31 | Apple Inc. | Depth sensing using a sparse array of pulsed beams |
EP3715907B1 (en) | 2019-03-27 | 2024-02-28 | Infineon Technologies AG | Methods and apparatuses for compensating light reflections from a cover of a time-of-flight camera |
DE112020001783T5 (en) | 2019-04-02 | 2021-12-30 | Ams International Ag | Time-of-flight sensor |
US11500094B2 (en) * | 2019-06-10 | 2022-11-15 | Apple Inc. | Selection of pulse repetition intervals for sensing time of flight |
US10613203B1 (en) | 2019-07-01 | 2020-04-07 | Velodyne Lidar, Inc. | Interference mitigation for light detection and ranging |
CN110609293B (en) | 2019-09-19 | 2022-05-27 | Shenzhen Oradar Technology Co., Ltd. | Distance detection system and method based on time of flight |
2020
- 2020-05-28 US US16/885,316 patent/US11500094B2/en active Active
- 2020-06-01 EP EP20177707.5A patent/EP3751307B1/en active Active
- 2020-06-01 EP EP22167103.5A patent/EP4050369B1/en active Active
- 2020-06-05 KR KR1020200068248A patent/KR102433815B1/en active IP Right Grant
- 2020-06-09 TW TW109119267A patent/TWI745998B/en active
- 2020-06-10 CN CN202010521767.4A patent/CN112068149B/en active Active
2022
- 2022-08-12 KR KR1020220101419A patent/KR102518450B1/en active IP Right Grant
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107438774A (en) * | 2015-04-20 | 2017-12-05 | ResMed Sensor Technologies Ltd | Multi-sensor radio-frequency detection |
EP3285087A1 (en) * | 2016-08-19 | 2018-02-21 | ams AG | Sensor arrangement and method for determining time-of-flight |
CN109791202A (en) * | 2016-09-22 | 2019-05-21 | Apple Inc. | LiDAR with irregular pulse sequence |
CN109791195A (en) * | 2016-09-22 | 2019-05-21 | Apple Inc. | Adaptive transmit power control for LiDAR |
Also Published As
Publication number | Publication date |
---|---|
EP3751307B1 (en) | 2022-06-22 |
KR102433815B1 (en) | 2022-08-18 |
US11500094B2 (en) | 2022-11-15 |
EP4050369A1 (en) | 2022-08-31 |
KR20220117190A (en) | 2022-08-23 |
TWI745998B (en) | 2021-11-11 |
TW202102877A (en) | 2021-01-16 |
EP4050369B1 (en) | 2024-06-05 |
KR20200141937A (en) | 2020-12-21 |
US20200386890A1 (en) | 2020-12-10 |
KR102518450B1 (en) | 2023-04-04 |
EP3751307A1 (en) | 2020-12-16 |
CN112068149A (en) | 2020-12-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112068149B (en) | Sensing device and method for sensing | |
US11703569B2 (en) | LIDAR data acquisition and control | |
US20210181317A1 (en) | Time-of-flight-based distance measurement system and method | |
CN111465870B (en) | Time-of-flight sensing using an array of addressable emitters | |
US11686826B2 (en) | Measuring time-of-flight using a plurality of detector subsystems and histogram storage | |
US20180081041A1 (en) | LiDAR with irregular pulse sequence | |
CN113330327A (en) | Depth sensing calibration using a sparse array of pulsed beams | |
US10955552B2 (en) | Waveform design for a LiDAR system with closely-spaced pulses | |
CN111965658B (en) | Distance measurement system, method and computer readable storage medium | |
US9404999B2 (en) | Localization system and localization method | |
CN109471118A (en) | Based on the cumulative laser ranging system with waveform sampling of echo waveform | |
CN112105944A (en) | Optical ranging system with multimode operation using short and long pulses | |
CN111965659B (en) | Distance measurement system, method and computer readable storage medium | |
US20220350026A1 (en) | Selection of pulse repetition intervals for sensing time of flight | |
US11681028B2 (en) | Close-range measurement of time of flight using parallax shift | |
US20230196501A1 (en) | Systems and Methods for Memory-Efficient Pixel Histogramming | |
CN216211121U (en) | Depth information measuring device and electronic apparatus | |
WO2023114253A1 (en) | Systems and methods for memory-efficient pixel histogramming | |
CN113822875A (en) | Depth information measuring device, full-scene obstacle avoidance method and electronic equipment | |
CN117890918A (en) | Distance measurement system, method and computer readable storage medium based on time of flight | |
JP2021089248A (en) | Laser radar |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||