CN111239708B - Light detection and ranging sensor - Google Patents
Light detection and ranging sensor
- Publication number
- CN111239708B (application number CN202010063812.6A)
- Authority
- CN
- China
- Prior art keywords
- scan
- target scene
- light
- array
- sensing elements
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/4861—Circuits for detection, sampling, integration or read-out
- G01S7/4863—Detector arrays, e.g. charge-transfer gates
- G01S7/4817—Constructional features, e.g. arrangements of optical elements relating to scanning
- G01S17/10—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
- G01S17/42—Simultaneous measurement of distance and other co-ordinates
- G01S17/89—Lidar systems specially adapted for mapping or imaging
- G01S17/894—3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
Abstract
The present disclosure relates to light detection and ranging sensors. An electro-optical device (18) comprises a laser light source (20), which emits at least one beam of light pulses, a beam control device (24), which transmits and scans the at least one beam across a target scene (22), and an array (28) of sensing elements (44). Each sensing element outputs a signal indicative of the time of incidence of a single photon on the sensing element. Light collection optics (27) image the target scene scanned by the transmitted beam onto the array. Circuitry (50) is coupled to actuate the sensing elements only in a selected region (70) of the array and to sweep the selected region across the array in synchronization with the scanning of the at least one beam.
Description
The present application is a divisional application of the invention patent application with application number 201680074428.8, filed December 8, 2016, and entitled "Light detection and ranging sensor."
Technical Field
The present invention relates generally to optoelectronic devices, and in particular to light detection and ranging (lidar) sensors.
Background
There is increasing demand for real-time three-dimensional imagers in existing and emerging consumer applications. These imaging devices, also commonly known as light detection and ranging (lidar) sensors, enable the remote measurement of the distance (and often the intensity) of each point on a target scene (known as the target scene depth) by illuminating the target scene with a light beam and analyzing the reflected optical signal. A common technique for determining the distance to each point on the target scene involves sending a pulsed beam of light toward the target scene and measuring the round-trip time, i.e., the time of flight (ToF) that the light takes to travel from the light source to the target scene and back to a detector in the vicinity of the light source.
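The round-trip ToF relation described above can be sketched in a few lines of code; the factor of two accounts for the beam traveling to the target and back:

```python
# Round-trip time-of-flight to distance: the pulse travels to the target
# and back, so the one-way distance is c * t / 2.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_to_distance(tof_seconds: float) -> float:
    """Convert a measured round-trip time of flight to target distance in meters."""
    return C * tof_seconds / 2.0

def distance_to_tof(distance_m: float) -> float:
    """Inverse relation: round-trip time for a target at the given distance."""
    return 2.0 * distance_m / C

# A target 15 m away returns the pulse after roughly 100 ns.
tof = distance_to_tof(15.0)
```

As the example suggests, ranging at meter scales requires timing resolution on the order of nanoseconds or better, which is why the single-photon detectors discussed below resolve arrival times to tens of picoseconds.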
A suitable detector for ToF-based lidar is provided by Single Photon Avalanche Diode (SPAD) arrays. SPADs, also known as Geiger-mode avalanche photodiodes (GAPDs), are detectors capable of capturing single photons with very high time-of-arrival resolution, on the order of tens of picoseconds, and can be fabricated in proprietary semiconductor processes or in standard CMOS processes. SPAD sensor arrays fabricated on a single chip have been used experimentally in 3D imaging cameras. Charbon et al. provide a useful review of SPAD technology in "SPAD-Based Sensors," published in TOF Range-Imaging Cameras (Springer-Verlag, 2013), which is incorporated herein by reference.
In SPAD, the p-n junction is reverse biased at a level well above the breakdown voltage of the junction. Under this bias, the electric field is so high that a single charge carrier injected into the depletion layer due to an incident photon can trigger a self-sustaining avalanche. The leading edge of the avalanche current pulse marks the detected photon arrival time. The current continues until the avalanche is quenched by lowering the bias voltage to or below the breakdown voltage. The latter function is performed by a quenching circuit that may include only a high resistance rectifier load in series with the SPAD, or alternatively an active circuit element.
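The bias-and-quench cycle described above can be illustrated with a minimal state sketch (not a circuit simulation; the breakdown voltage is an illustrative value, not taken from the patent):

```python
# Minimal sketch of SPAD arming and quenching: an avalanche can start
# only while the reverse bias exceeds the breakdown voltage, and the
# quenching circuit stops it by lowering the bias to breakdown.
V_BREAKDOWN = 25.0  # volts; illustrative value only

class Spad:
    def __init__(self, bias: float):
        self.bias = bias          # reverse bias applied to the p-n junction
        self.avalanching = False  # self-sustaining avalanche in progress?

    def photon(self) -> bool:
        """An incident photon triggers an avalanche only when the SPAD is armed."""
        if self.bias > V_BREAKDOWN and not self.avalanching:
            self.avalanching = True
        return self.avalanching

    def quench(self):
        """Quenching: drop the bias to breakdown, extinguishing the avalanche."""
        self.bias = V_BREAKDOWN
        self.avalanching = False

spad = Spad(bias=V_BREAKDOWN + 2.0)  # biased above breakdown: armed
fired = spad.photon()                 # avalanche triggered
spad.quench()
rearmed_fire = spad.photon()          # at breakdown, no avalanche can start
```

In a real device the quenching circuit also restores the excess bias after a dead time so the SPAD can detect the next photon; that recharge step is omitted here.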
Disclosure of Invention
Embodiments of the invention described herein below provide improved LiDAR sensors and methods of use thereof.
Thus, according to an embodiment of the present invention, there is provided an electro-optical device comprising a laser light source configured to emit at least one beam of light pulses, a beam control device configured to transmit and scan the at least one beam across a target scene, and an array of sensing elements. Each sensing element is configured to output a signal indicative of a time of incidence of a single photon on the sensing element. Light collection optics are configured to image the target scene scanned by the transmitted beam onto the array. Circuitry is coupled to actuate the sensing elements only in a selected region of the array and to sweep the selected region across the array in synchronization with the scanning of the at least one beam.
In some embodiments, the circuitry is configured to select the region such that at any time during the scan the selected region comprises a portion of the array onto which the light collection optics image a region of the target scene illuminated by the at least one light beam. The selected region may include a sensing element or a plurality of sensing elements.
In the disclosed embodiments, the circuitry is configured to process the signals output by the sensing elements in order to determine the respective distances of points in the target scene. Typically, the sensing element comprises a single photon detector, such as a Single Photon Avalanche Diode (SPAD).
In some embodiments, the laser light source is configured to emit at least two light beams along different respective beam axes such that at any time during a scan the light collection optics image respective areas of the target scene illuminated by the at least two light beams onto different respective sensing elements. In these embodiments, the beam control device is configured to scan the at least two beams across the target scene in a two-dimensional scan, and the circuit is configured to sweep a selected area on the array in a two-dimensional pattern corresponding to the two-dimensional scan. For example, the two-dimensional scan may form a raster pattern, wherein respective beam axes of the at least two beams are laterally offset from each other with respect to a scan line direction of the raster pattern.
Alternatively, the beam control device is configured to scan the at least two beams across the target scene in a first direction in a linear scan, and the at least two beams comprise a plurality of beams arranged along a column axis in a second direction perpendicular to the first direction. In one embodiment, the plurality of light beams are arranged in at least two columns having respective column axes orthogonal to a first direction of the scan and offset from each other.
There is also provided, in accordance with an embodiment of the present invention, a method for sensing, which includes emitting at least one beam of light pulses and transmitting and scanning the at least one beam across a target scene. An array of sensing elements is provided, each configured to output a signal indicative of a time of incidence of a single photon on the sensing element. The target scene scanned by the transmitted beam is imaged onto the array. The sensing elements are actuated only in a selected region of the array, and the selected region is swept across the array in synchronization with the scanning of the at least one beam.
The invention will be more fully understood from the following detailed description of embodiments of the invention, taken together with the accompanying drawings, in which:
Drawings
FIG. 1 is a schematic diagram of a LiDAR system according to an embodiment of the present invention;
FIG. 2 is a block diagram schematically illustrating a SPAD-based sensing device in accordance with an embodiment of the present invention;
FIG. 3 is a block diagram illustrating components of sensing elements in a SPAD array according to an embodiment of the invention;
FIG. 4 is a block diagram schematically illustrating a SPAD array with a scanned sensitive region in accordance with an embodiment of the present invention;
FIG. 5 is a schematic diagram of a detector array with circular scanning illumination spots according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a detector array with circular scanning illumination spots according to another embodiment of the present invention;
FIGS. 7A-C are schematic illustrations of a detector array having elliptical scanning illumination spots according to another embodiment of the present invention;
FIG. 8 is a schematic diagram of a detector array having two circular illumination spots scanned in a two-dimensional raster scan, in accordance with an embodiment of the present invention;
FIG. 9 is a schematic diagram of a detector array having an interleaved array of scanned illumination spots in a one-dimensional scan according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of a LiDAR device implementing one-dimensional scanning in accordance with an embodiment of the present invention;
FIG. 11 is a schematic diagram of a LiDAR device implementing one-dimensional scanning in accordance with another embodiment of the present invention;
FIG. 12 is a schematic diagram of a LiDAR device using a laser source with adjustable emission power according to an embodiment of the present invention; and
FIG. 13 is a schematic diagram of a LiDAR device using two laser sources with different emission powers according to an embodiment of the present invention.
Detailed Description
SUMMARY
The quality of lidar measurements of the distance to each point in a target scene (the target scene depth) is often compromised in practical implementations by a range of environmental, fundamental, and manufacturing challenges. An example of an environmental challenge is the presence of unrelated background light in indoor and outdoor applications, such as solar ambient light, with an irradiance of typically up to 1000 W/m². The fundamental challenges relate to losses of the reflected light signal at the surface of the target scene, particularly due to low target-scene reflectivity and a limited optical collection aperture, as well as electronic and photonic shot noise. These limitations often impose inflexible trade-offs that motivate designers to adopt solutions involving large optical apertures, high optical power, a narrow field of view (FoV), bulky mechanical construction, low frame rates, and restriction of sensor operation to controlled environments.
Embodiments of the invention described herein address the above limitations so as to enable compact, low-cost lidar that achieves accurate, high-resolution depth imaging while operating in uncontrolled environments. The disclosed embodiments use one or more pulsed laser sources emitting light beams to generate high-irradiance illumination spots at the intersection of each emitted beam axis with the target scene. The beams, and thus the illumination spots, are scanned across the target scene. Illumination reflected from the target scene is imaged by collection optics onto a time-of-flight single-photon detector array with high signal-to-noise ratio, and the distance to each point of the target scene is derived from the time-of-flight data.
As is known in the art, imaging the target scene onto the detector array produces a one-to-one correspondence, defined by geometric optics, between locations in the target scene and locations on the detector array. An area of the target scene is thus imaged onto a corresponding image area on the detector, with each linear dimension in the image given by the corresponding length in the target scene multiplied by the optical magnification M, where typically M << 1 for a lidar system. Similarly, the sensing elements of the detector array may be considered as imaged back onto the target scene at a magnification of 1/M, which gives the location and area of the target scene "seen" by each sensing element.
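The scene-to-array mapping above reduces to two reciprocal scalings; the magnification and pixel pitch below are illustrative assumptions, not values from the patent:

```python
# Geometric-optics mapping between target scene and detector array at
# magnification M (M << 1 for a lidar): a length L in the scene maps to
# M * L on the detector, and a sensing element of pitch p "sees" a scene
# patch of size p / M.
def image_length(scene_length_m: float, magnification: float) -> float:
    """Length on the detector corresponding to a length in the scene."""
    return scene_length_m * magnification

def scene_footprint(pixel_pitch_m: float, magnification: float) -> float:
    """Size of the scene patch imaged onto one sensing element."""
    return pixel_pitch_m / magnification

M = 1e-3        # illustrative magnification
pitch = 25e-6   # illustrative 25 um sensing-element pitch
footprint = scene_footprint(pitch, M)  # scene patch per sensing element, meters
```

With these assumed numbers, each 25 µm sensing element images a 25 mm patch of the scene, which is the averaging area referred to in the next paragraphs.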
In the disclosed embodiments, the detector array comprises a two-dimensional array of single-photon, time-sensitive sensing elements, such as Single Photon Avalanche Diodes (SPADs). Each SPAD is individually addressed by a dedicated control circuit, with the sensitivity (including the on/off state) of each SPAD controlled by its specific reverse p-n junction bias voltage. In some embodiments each SPAD operates as a separate sensing element, while in other embodiments multiple SPADs are binned together into superpixels. At any point during the scan, only the sensing elements in the region or regions of the array that receive reflected illumination from the scanned beam are actuated. Thus, a sensing element is activated only when its signal provides useful information. This approach both reduces the background signal, which would otherwise degrade the signal-to-background ratio, and lowers the power consumption of the detector array.
Lidar measures the distances to a set of discrete points in the target scene, with a finite averaging area associated with each point. In the disclosed embodiments, the measurement and the actuation of the sensing elements are governed by the following lidar system parameters:
1) the size of the illumination spot,
2) the resolution of the beam steering device (the size of the step, or offset, of the beam steering device between successive distance measurements), and
3) the size of the superpixels of the detector array, in other words the number of sensing elements binned together in a ToF measurement (including the case in which a single sensing element serves as a superpixel).
The effects of these lidar system parameters can be divided into two cases:
a) the small-spot case, in which the illumination spot is smaller than the superpixel, and
b) the large-spot case, in which the illumination spot is larger than the superpixel, where the sizes are compared by considering the illumination spot and the superpixel in the same optical plane (either in the target scene or at the detector array). The two cases are summarized in the following table and detailed further in the context of the figures.
Table 1: influence of LiDAR System parameters
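The small-spot/large-spot distinction can be checked mechanically by comparing both sizes in the same optical plane, as the text prescribes; the sizes and magnification below are illustrative assumptions:

```python
# Classify the lidar operating regime of Table 1: project the scene-side
# spot onto the detector plane (multiply by the magnification M) and
# compare it with the superpixel size there.
def spot_case(spot_size_scene: float, superpixel_size: float,
              magnification: float) -> str:
    """Return 'small spot' or 'large spot' per the comparison in Table 1."""
    spot_on_detector = spot_size_scene * magnification
    return "small spot" if spot_on_detector < superpixel_size else "large spot"

# Illustrative numbers: a 10 mm spot in the scene, 50 um superpixel, M = 1e-3.
case = spot_case(spot_size_scene=0.01, superpixel_size=50e-6, magnification=1e-3)
```

Equivalently, the comparison could be made in the scene plane by dividing the superpixel size by M; either convention gives the same classification.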
In some embodiments of the invention, the target scene is illuminated and scanned by one or more laser beams. In some embodiments utilizing multiple beams, the beams are generated by splitting a laser beam using diffractive optical elements, prisms, beam splitters, or other optical elements known in the art. In other embodiments, several separate laser sources are used to generate the multiple beams. In some of these embodiments, a monolithic laser array (such as a VCSEL array or a VECSEL array) is used to generate the multiple beams.
In some embodiments, a beam control device, such as a scanning mirror, is operated to scan the target scene with a single beam in a two-dimensional raster scan. (Raster scanning typically comprises long, nearly straight back-and-forth sweeps, so-called scan lines, together with short movements that shift the scan point from one scan line to the next.) The raster pattern is described herein by way of example, and alternative scan patterns implementing similar principles are considered to be within the scope of the present invention. When a single light beam is used, the scanning resolution in the direction perpendicular to the scan lines is given by the spacing between successive scan lines. The scan resolution can be increased by reducing this spacing, but the increase comes at the cost of a reduced frame rate, since a greater number of scan lines is required to cover the scene. Alternatively, if the number of scan lines per frame is kept unchanged, the resolution may be increased at the cost of a reduced field of view. Mechanical constraints limit the extent to which the scan speed of the mirror can be increased to counteract these effects.
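The raster trajectory just described (long back-and-forth scan lines plus a short shift between lines) can be sketched as a generator; the bidirectional ordering is an assumption for illustration, since the text does not specify the return path:

```python
# Sketch of a bidirectional raster scan: even-numbered scan lines run
# left-to-right, odd-numbered lines run right-to-left, with the short
# line-to-line shift implicit in the change of `line`.
def raster_scan(n_lines: int, n_steps: int):
    """Yield (line, step) positions along the raster trajectory."""
    for line in range(n_lines):
        steps = range(n_steps) if line % 2 == 0 else range(n_steps - 1, -1, -1)
        for step in steps:
            yield (line, step)

path = list(raster_scan(2, 3))
```

The spacing between successive values of `line` is the scan-line pitch that sets the cross-line resolution discussed above.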
In an embodiment, the scanning resolution in the direction perpendicular to the scan lines is increased by using a plurality of light beams that are offset from one another both laterally with respect to the scan-line direction and along the scan-line direction. The spacing of the beams along the scan line is configured such that each beam illuminates a separate superpixel on the detector array, so that each beam can be identified individually. The lateral spacing of the beams, rather than the scan-line density, now determines the scan resolution. The disclosed embodiments thus enable an increase in lateral resolution without reducing the size of the sensing elements, thereby alleviating the need for miniaturization of the detector array.
In another embodiment, a plurality of illumination spots are scanned across the target scene in a linear scan. (Here linear scanning refers to scanning along a single direction, even if the scan line deviates from a straight line due to optical or mechanical imperfections.) A one-dimensional linear scan allows the use of a simpler and cheaper beam steering device than a two-dimensional scan, but the number of beams required to cover the target scene with sufficiently high resolution is generally higher than for a two-dimensional scan. A single-column scan may be achieved by configuring the multiple beams in a column perpendicular to the scan line, thereby generating a column of illumination spots. The highest scanning resolution along the column axis is obtained when each illumination spot is imaged onto a separate sensing element in the detector array.
In another embodiment utilizing linear scanning, the scanning resolution perpendicular to the scan lines is increased by creating multiple columns of illumination spots perpendicular to the scan lines and offset from each other in the direction of the column axis. The plurality of columns are also offset from each other in the direction of the scan line by at least one sensing element such that each illumination spot illuminates a separate sensing element and thus allows each illumination spot to be identified separately. This embodiment achieves an increase in lateral resolution without reducing the size of the sensing element, thereby alleviating the miniaturization requirements of the detector array.
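The interleaved-column layout just described can be sketched numerically: two columns, offset by half a pitch along the column axis (so the interleaved spots sample twice as finely) and by one sensing element along the scan line (so each spot lands on its own sensing element). The half-pitch offset is an illustrative choice, not a value from the patent:

```python
# Two interleaved columns of illumination spots. Coordinates are
# (x, y) with x along the scan line and y along the column axis,
# in units of the sensing-element pitch.
def interleaved_spots(n_per_column: int, pitch: float):
    spots = []
    for i in range(n_per_column):
        spots.append((0.0, i * pitch))                  # first column
        spots.append((pitch, i * pitch + pitch / 2.0))  # second column, offset
    return sorted(spots, key=lambda s: s[1])

spots = interleaved_spots(3, 1.0)
ys = [y for _, y in spots]  # effective sampling interval along y is pitch / 2
```

With a single column the samples along the column axis would fall at 0, 1, 2; the second, offset column fills in the half-pitch positions, doubling the cross-scan resolution without shrinking the sensing elements.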
Some embodiments of the invention provide lidar systems with a wide field of view (FoV) that cover a large range of depths. Because high-efficiency, wide-FoV optics would entail bulky and expensive components, these embodiments apply dedicated designs and modes of use of the laser sources, detector arrays, electronics, and algorithms to measure scene depth over a wide FoV and a large range of distances, while maintaining simplicity of the optical design and structure.
The main consideration for the laser source relates to its emission power: if only a low-power laser source is used to scan the target scene, the signal received by the detector array from distant points in the target scene is too weak for robust and accurate measurement. On the other hand, if only a high-power laser source, capable of measuring distant target points, is used, the lidar expends unnecessarily high emission power on nearby target points, increasing its power consumption. Thus, in some embodiments of the invention, the emission power of the laser source is adjusted based on the measured distance.
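One simple adjustment policy consistent with the paragraph above scales pulse energy with the square of the measured distance (since the collected return power falls roughly as 1/R² for a resolved spot), clamped to the source's limits. The quadratic law and all constants are illustrative assumptions, not the patent's specified control law:

```python
# Sketch of distance-adaptive emission power. Power is normalized to
# the source maximum; constants are illustrative only.
P_MIN, P_MAX = 0.05, 1.0  # normalized emission-power limits
D_REF = 5.0               # assumed distance (m) at which P_MIN suffices

def emission_power(measured_distance_m: float) -> float:
    """Scale power with distance squared, clamped to [P_MIN, P_MAX]."""
    p = P_MIN * (measured_distance_m / D_REF) ** 2
    return min(max(p, P_MIN), P_MAX)
```

Under this policy a nearby target at 5 m gets the minimum power, while targets beyond roughly 22 m saturate at full power; the control circuit would update the setting from the previous distance measurement at each scan point.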
System description
FIG. 1 schematically illustrates a LiDAR system 18, according to an embodiment of the present invention. A beam or beams from a laser light source 20, comprising one or more pulsed lasers, are directed through a dual-axis beam steering device 24 to a target scene 22, on which an illumination spot 26 is formed and scanned. (The term "light" is used herein to refer to any kind of optical radiation, including radiation in the visible, infrared, and ultraviolet ranges.) The beam steering device may comprise, for example, a scanning mirror or any other suitable type of optical deflector or scanner known in the art. Illumination spot 26 is imaged by collection optics 27 onto a two-dimensional detector array 28, which comprises single-photon, time-sensitive sensing elements, such as SPADs.
In addition to the illumination spot 26, the target scene 22 is illuminated by an ambient light source 36, such as the sun. To achieve a high signal-to-background ratio, the irradiance of the illumination spot is chosen to be much higher than that of the ambient illumination, which, for sunlight, can reach up to 1000 W/m². A bandpass filter 37 is used to further reduce the ambient illumination reaching detector array 28.
The control circuit 38 is connected to laser source 20, clocking the pulse emission and controlling the emission power, and to dual-axis beam steering device 24, controlling the scanning of illumination spot 26. In addition, control circuit 38 dynamically adjusts the reverse p-n junction bias voltage of each SPAD in detector array 28, thereby controlling the actuation and sensitivity of each SPAD. Since the known timing of the pulses from laser source 20 and the known state of beam steering device 24 determine the position of illumination spot 26 on target scene 22, control circuit 38 actuates at any given moment only those SPADs onto which the illumination spot is imaged by collection optics 27. Further using this knowledge of laser source 20 and beam steering device 24, together with the signals read from detector array 28, control circuit 38 determines the distance to each scanned point in target scene 22 from the measured time of flight.
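The synchronization step above (arming only the SPADs onto which the spot is currently imaged) can be sketched as follows; the array size, neighborhood radius, and the assumption that the imaged spot position is already known in pixel coordinates are all illustrative:

```python
# Sketch: given the predicted pixel position of the imaged spot
# (known from the laser timing and mirror state), bias on only a small
# neighborhood of sensing elements around it.
ARRAY_W, ARRAY_H = 32, 32
REGION = 1  # arm a (2*REGION + 1)^2 neighborhood around the predicted pixel

def actuated_region(spot_col: int, spot_row: int):
    """Return the set of (row, col) sensing elements to actuate."""
    return {
        (r, c)
        for r in range(max(0, spot_row - REGION),
                       min(ARRAY_H, spot_row + REGION + 1))
        for c in range(max(0, spot_col - REGION),
                       min(ARRAY_W, spot_col + REGION + 1))
    }

region = actuated_region(10, 10)  # 3x3 block centered on (10, 10)
```

As the spot scans, the control circuit recomputes this region each step, sweeping the armed window across the array in lockstep with the beam while all other sensing elements stay off.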
Fig. 2-4 schematically illustrate the structure and function of a detector array 28 according to an embodiment of the invention. These figures illustrate one possible approach that may be used to selectively actuate SPAD-based sensing elements in an array using a combination of global and local bias control. Alternatively, other kinds of biasing and actuation schemes and other kinds of single photon sensing elements may be used for these purposes.
Fig. 2 is a block diagram schematically illustrating a detector array 28 according to an embodiment of the invention. As described further below, the detector array 28 includes sensing elements 44, each sensing element 44 including SPADs and associated bias and processing circuitry. The global high voltage bias generator 46 applies a global bias voltage to all of the sense elements 44 in the array 28. In addition, the local bias circuit 48 in each sense element 44 applies an excess bias that adds to the global bias in the sense element. The sense element bias control circuit 50 sets the excess bias voltage applied by the local bias circuit 48 to the corresponding value in the different sense elements. The global high voltage bias generator 46 and the sense element bias control circuit 50 are both connected to the control circuit 38 (fig. 1).
Fig. 3 is a block diagram illustrating components of one of the sensing elements 44 in the array 28 according to an embodiment of the invention. In the disclosed embodiment of the invention, the array 28 includes a two-dimensional matrix of sensing elements formed on a first semiconductor chip 52, with a second two-dimensional matrix of bias control and processing circuits formed on a second semiconductor chip 54. Chips 52 and 54 are coupled together (only a single element of each of the two matrices is shown) such that the two matrices are in one-to-one correspondence, whereby each sensing element on chip 52 is in contact with a corresponding bias control and processing element on chip 54.
Based on SPAD sensor designs known in the art and accompanying bias control and processing circuitry as described herein, both chips 52 and 54 can be fabricated from silicon wafers using well known CMOS fabrication processes. Alternatively, the detection designs and principles described herein may be implemented using other materials and processes where necessary. For example, all of the components shown in fig. 3 may be formed on a single chip, or the distribution of components may be different between chips. All such alternative implementations are considered to be within the scope of the invention.
The sensing element 44 includes a SPAD 56, which comprises a photosensitive p-n junction, as is known in the art. Peripheral circuits, including a quenching circuit 58 and the local bias circuit 48, are typically located on chip 54. As described above, the actual bias applied to the SPAD 56 is the sum of the global bias V_bias provided by the bias generator 46 (FIG. 2) and the excess bias applied by the local bias circuit 48. The sensing element bias control circuit 50 (FIG. 2) sets the excess bias to be applied in each sensing element by setting a corresponding digital value in a bias memory 60 on the chip 54.
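As a rough illustration of this bias arrangement, the sketch below models each element's total bias as the global voltage plus a locally stored digital excess-bias code scaled by a DAC step. The voltage values, the step size, and all names are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the per-element bias composition: each sensing
# element's operating voltage is the global bias plus a locally programmed
# excess bias, stored as a digital code (cf. bias memory 60).

V_GLOBAL = 20.0   # global bias from the high-voltage generator, volts (assumed)
DAC_STEP = 0.1    # excess-bias DAC resolution, volts per code (assumed)

def element_bias(excess_code: int) -> float:
    """Total bias applied to a SPAD: global bias + locally programmed excess."""
    return V_GLOBAL + excess_code * DAC_STEP

# A 4x4 bias memory: a code of 0 leaves an element near breakdown (low
# sensitivity); a higher code raises its sensitivity.
bias_memory = [[0] * 4 for _ in range(4)]
bias_memory[1][2] = 15  # enable one element with 1.5 V excess bias

total = element_bias(bias_memory[1][2])
```

Writing different codes into different entries of the bias memory is what lets the control circuitry shape and move the sensitive region, as described below.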
In response to each captured photon, SPAD 56 outputs an avalanche pulse that is received by processing circuitry on chip 54, including digital logic 62 and memory configured as output buffer 64. For example, the processing elements may be configured to function as a time-to-digital converter (TDC) that measures the delay of each pulse output by the SPAD 56 relative to a reference time and outputs a digital data value corresponding to the delay. Alternatively, logic 62 and buffer 64 may measure and output other kinds of values, including but not limited to, a histogram of pulse delay times, a binary waveform, or a multi-level digital waveform. The output of the chip 54 is connected to the control circuit 38 (fig. 1).
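The time-to-digital conversion and histogramming attributed to logic 62 and buffer 64 can be sketched as follows. The bin width, histogram depth, and function names are illustrative assumptions.

```python
# Sketch of a TDC plus delay-time histogram: each avalanche pulse's delay
# relative to a reference (e.g. laser fire) time is quantized into a bin.

TDC_BIN_NS = 0.25  # TDC resolution in nanoseconds (assumed)

def tdc_code(pulse_time_ns: float, reference_time_ns: float) -> int:
    """Quantize a pulse's delay relative to the reference time into a code."""
    return int((pulse_time_ns - reference_time_ns) / TDC_BIN_NS)

def histogram(pulse_times_ns, reference_time_ns, n_bins=64):
    """Accumulate pulse delays into a delay-time histogram (output buffer)."""
    hist = [0] * n_bins
    for t in pulse_times_ns:
        code = tdc_code(t, reference_time_ns)
        if 0 <= code < n_bins:  # discard pulses outside the gated window
            hist[code] += 1
    return hist
```

In a real sensor these operations run in per-element hardware; the sketch only shows the arithmetic relationship between pulse times, TDC codes, and histogram bins.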
Fig. 4 is a block diagram schematically illustrating the SPAD array 28 with a scanned region 70 of sensitivity, according to an embodiment of the present invention. In this case, the bias control circuit 50 sets the bias voltages of the sensing elements 72 within the region 70 to a higher value than those of the remaining sensing elements 76, which are thereby effectively turned off. The bias control circuit 50 dynamically modifies the bias voltages of the sensing elements in order to sweep the region 70 across the array, as indicated by the arrows in the figure. For example, circuit 50 may sweep region 70 in a raster pattern (as shown in the figures that follow), in synchronization with the scanning of the laser beam across the target scene that is imaged onto the array 28.
As previously mentioned, this embodiment is particularly useful in adapting the sensitive area of the array 28 to the shape of the illumination beam or to a region of interest in the target scene being imaged, thus maximizing the sensitivity of the array 28 relative to its power consumption, while reducing background noise from sensing elements that would not contribute to the signal.
In an alternative embodiment of the present invention (e.g., as shown in fig. 9), bias control circuit 50 sets the local bias voltages such that region 70 has a linear shape, extends along one or more columns of array 28, and matches the linear shape of the illumination beam or beam array. Circuitry 50 may then sweep this linear region 70 across array 28 in synchronization with the illumination beam. Alternatively, other scan patterns may be implemented, including conventional scan patterns and adaptive scan patterns.
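The synchronized sweep of the sensitive region can be sketched as a mapping from scan step to the set of enabled elements. The region shape, the step-to-origin mapping, and all names are illustrative assumptions.

```python
# Sketch of sweeping a biased-on region of sensitivity across the array in
# synchronization with the laser scan. Elements outside the returned set
# would have their excess bias lowered so that avalanche multiplication
# cannot be sustained (i.e. they are off).

def active_region(step, region_rows, region_cols, array_rows, array_cols):
    """Return the set of (row, col) elements enabled at a given scan step.

    The region origin advances one column per step, then drops one row,
    mirroring the expected image position of the illumination beam."""
    span_cols = array_cols - region_cols + 1
    span_rows = array_rows - region_rows + 1
    col0 = step % span_cols
    row0 = (step // span_cols) % span_rows
    return {(row0 + r, col0 + c)
            for r in range(region_rows) for c in range(region_cols)}
```

A linear region spanning a full column, as in the alternative embodiment above, is the special case `region_rows = array_rows`, swept only along the column axis.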
Example scanning pattern and superpixel
Fig. 5 is a schematic diagram illustrating the detector array 28 with the image of a circular scanning illumination spot 26 (fig. 1) superimposed on the array, in accordance with an embodiment of the present invention. The moving image of the illumination spot 26, projected by the collection optics 27 onto the detector array 28, is observed at three consecutive points in time: t = t_{i-1}, t = t_i and t = t_{i+1}. The images of the scanning illumination spot 26 at these three successive points in time are represented by circles 84, 86 and 88, respectively; in this embodiment, their diameter is twice the pitch of the sensing elements 44. Arrow 90 indicates the scanning direction of the image of the scanning illumination spot 26, whose expected position is determined from knowledge of the state of the beam control device 24.
At each point in time, the sensing elements 44 in the area of the array 28 that best matches the position of the image of the illumination spot 26 at that time are actuated. These actuated sensing elements may be considered a "superpixel". In the embodiment shown in fig. 5, each superpixel comprises an array of 2x2 sensing elements, but in some embodiments the dimensions of the superpixel may take on other values, either statically or dynamically.
At time t = t_{i-1}, superpixel 92 (containing circle 84) is actuated; at time t = t_i, superpixel 94 (circle 86) is actuated; and at time t = t_{i+1}, superpixel 96 (circle 88) is actuated. Thus, in the illustrated embodiment, each sensing element 44 is associated with two adjacent superpixels. Only those sensing elements within the active superpixel are actuated at any given moment; the remaining sensing elements are turned off by reducing their bias voltage to a level at which avalanche multiplication cannot be sustained. This operation maximizes the collection of optical signal from the image of the scanning illumination spot 26 while reducing exposure to background illumination from the target scene that is unrelated to the illumination spot, thereby increasing the signal-to-background ratio of the array 28. In some embodiments of the present invention, the outputs of the sensing elements that are not illuminated by the image of the scanning spot 26 are masked using standard logic gates.
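Selecting the 2x2 superpixel that best matches the expected spot-image position can be sketched as a simple snapping rule. Coordinates are in units of sensing-element pitch; the function name and the rounding rule are illustrative assumptions.

```python
# Sketch of choosing the 2x2 block of sensing elements whose center lies
# nearest the expected center of the spot image (known from the state of
# the beam control device). Element (row, col) covers the unit square
# [col, col+1) x [row, row+1), so a 2x2 block starting at col0 is centered
# at col0 + 1.

def superpixel_for_spot(spot_x: float, spot_y: float):
    """Return the four (row, col) elements of the best-matching 2x2 superpixel."""
    col0 = round(spot_x - 1.0)
    row0 = round(spot_y - 1.0)
    return [(row0, col0), (row0, col0 + 1),
            (row0 + 1, col0), (row0 + 1, col0 + 1)]
```

As the expected spot center advances by one pitch per scan step, the returned block shifts by one element, so that (as described above) each sensing element belongs to two adjacent superpixels along the scan line.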
The lateral resolution of the target scene 22 in the scan direction is determined by the discrete steps of the scan (as set by the scan speed and the laser pulse repetition rate), which in this embodiment are one pitch of the sensing elements 44. The area over which the target scene distance is averaged is approximately the area of the superpixel.
Fig. 6 is a schematic diagram illustrating the detector array 28 with the image of a circular scanning illumination spot 26 (fig. 1) superimposed on the array, according to another embodiment of the present invention. The moving image of the illumination spot is observed at three consecutive points in time: t = t_{i-1}, t = t_i and t = t_{i+1}. The diameter of the image of the scanning illumination spot and the scanning step between two successive time points are both half the pitch of the sensing elements 44. The images of the scanning illumination spot 26 at the three consecutive points in time are represented by circles 100, 102 and 104, respectively. Arrow 105 indicates the scanning direction; the expected location of the image is determined based on knowledge of the state of the beam control device 24. In this embodiment, superpixels consisting of a single sensing element 44 are used: superpixel 106 is actuated for t = t_{i-1}, and superpixel 108 is actuated for both t = t_i and t = t_{i+1}. The lateral resolution of the image of the target scene 22 in the scan direction is half the pitch of the sensing elements 44, and the area over which the target scene distance is averaged is the area of the illumination spot 26.
Figs. 7A-C are schematic diagrams illustrating the detector array 28 with the image of an elliptical scanning illumination spot 26 (fig. 1) superimposed on the array, according to another embodiment of the present invention. An elliptical illumination spot is obtained, for example, from an edge-emitting laser diode, in which the cross-section of the emitting junction is rectangular with a high aspect ratio. In this embodiment, the elliptical illumination spot 26 is shown with an aspect ratio of 3:1, but other aspect ratios may be used in other embodiments. The so-called fast axis (long dimension) of the elliptical image of the illumination spot 26 on the detector array 28 is approximately six times the pitch of the detector elements 44, and the slow axis (short dimension) is approximately twice the pitch. In a method similar to that of figs. 5-6, figs. 7A-C schematically show the moving image of the illumination spot 26 at three successive points in time: t = t_{i-1}, t = t_i and t = t_{i+1}. Each scan step on the detector array 28 is one pitch of the sensing elements 44. In this embodiment, superpixels of 2x2 sensing elements are used.
Fig. 7A schematically shows illumination spot 110, which is the image of the scanning illumination spot 26 at time t = t_{i-1}. Based on the expected position of the illumination spot 110, the superpixels actuated at this time are 112, 114 and 116 (the topmost and bottommost tips of the illumination spot are ignored, because they contribute very little to the signal). Arrow 118 indicates the direction of scanning; the expected position of the illumination spot 110 is determined from knowledge of the state of the beam control device 24.
Fig. 7B schematically shows illumination spot 120, which is the image of the scanning illumination spot 26 at time t = t_i. Based on the expected position of the illumination spot 120, the superpixels actuated at this time are 112, 114, 116 and 122. Four superpixels are now actuated, because a significant portion of the illumination spot 120 (the top of the ellipse) is still within pixel 112, while another significant portion (the bottom of the ellipse) has entered pixel 122. Superpixels 112, 114 and 116 continue to collect signal in order to improve the signal-to-noise ratio. As in fig. 7A, arrow 118 indicates the direction of scanning; the expected position of the illumination spot 120 at time t = t_i is determined based on knowledge of the state of the beam control device 24.
Fig. 7C schematically shows illumination spot 124, which is the image of the scanning illumination spot 26 at time t = t_{i+1}. Based on the expected position of the illumination spot 124, the superpixels actuated at this time are 114, 116 and 122. Only three superpixels are now actuated, as pixel 112 (fig. 7B) is no longer illuminated by any significant portion of the illumination spot 124. As in figs. 7A-B, arrow 118 indicates the direction of scanning; the expected position of the illumination spot 124 at t = t_{i+1} is determined based on knowledge of the state of the beam control device 24. In the illustrated embodiment, each superpixel is exposed to the image of the illumination spot 26 over seven scan steps, thus improving the signal-to-noise ratio.
Since the length of the elliptical illumination spot is much larger than a superpixel, the resolution in the scan direction is determined by the superpixel size. Because the superpixel size is one third of the length of the elliptical illumination spot along its fast (long) axis, the resolution obtained in the scan-line direction is three times finer than could be obtained with the elliptical illumination spot alone. The area over which each distance measurement is averaged is the area of the superpixel.
In figs. 5-7, idealized shapes (circular or elliptical) have been used for the image of the illumination spot 26 on the detector array 28. In an embodiment of the present invention, the control circuit 38 calculates (or looks up) the actual shape of the illumination spot image on the detector array, and the result of this calculation is used to select the sensing elements to be actuated at each point of the scan. This calculation takes into account the design of the beam steering device 24, the characteristics of its scanning motion, its precise state, and the effect of the angle between the beam from the laser source 20 and the beam steering device, as these affect the shape, direction of motion, and orientation of the image of the illumination spot 26. In addition, the dependence of the image on the distance between the LiDAR device and the target scene 22 is taken into account. This effect is significant especially when the range of the target scene is short compared to the separation between the beam steering device 24 and the collection optics 27. The above calculations are performed in order to obtain the best overlap between the actuated sensing elements 44 and the image of the illumination spot 26 on the detector array 28, while achieving the desired vertical and horizontal angular resolutions, thereby optimizing the signal-to-background and signal-to-noise ratios.
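One way to turn a computed spot shape into a set of actuated superpixels is an overlap test: actuate every superpixel whose overlap with the spot image exceeds a threshold, ignoring regions that contribute very little signal (as the ellipse tips were ignored above). The sampling scheme, the threshold, and all names are illustrative assumptions, not the patent's method.

```python
# Sketch of overlap-based superpixel selection for an arbitrary computed
# spot shape, given as a point-membership predicate. Overlap is estimated
# by sampling a grid of points inside each superpixel's bounding box.

def select_superpixels(inside_spot, superpixels, n_samples=400, threshold=0.05):
    """Return names of superpixels whose sampled overlap fraction with the
    spot exceeds the threshold.

    inside_spot: predicate (x, y) -> bool for the computed spot image shape.
    superpixels: dict name -> (x0, y0, x1, y1) bounding box in pitch units."""
    selected = []
    n = int(n_samples ** 0.5)
    for name, (x0, y0, x1, y1) in superpixels.items():
        hits = sum(inside_spot(x0 + (x1 - x0) * (i + 0.5) / n,
                               y0 + (y1 - y0) * (j + 0.5) / n)
                   for i in range(n) for j in range(n))
        if hits / (n * n) > threshold:
            selected.append(name)
    return selected

# Elliptical spot with a 3:1 aspect ratio (fast axis vertical), centered at (1, 3):
ellipse = lambda x, y: ((x - 1.0) / 1.0) ** 2 + ((y - 3.0) / 3.0) ** 2 <= 1.0
```

The predicate can encode whatever shape the control circuitry computes (or looks up), including distortions due to the scan geometry and the target range.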
FIG. 8 is a schematic diagram illustrating a technique for enhancing the resolution of a raster-scanned LiDAR, in accordance with an embodiment of the present invention. The beam control device 24 scans the image of the illumination spot 26 (FIG. 1) over the detector array 28 in a raster scan pattern 130, scanning down one column of the detector array and up the next. If only one illumination spot were used, the lateral resolution transverse to the raster scan lines would be the pitch of the sensing elements 44. In this embodiment, however, the lateral resolution is doubled by using two scanning illumination spots 26, whose images on the detector array 28 are spaced apart by the pitch of the sensing elements 44 along the scan line and by half that pitch transverse to the scan line. The repetition rates of the beam control device 24 and the laser light source 20 are configured such that successive illumination spots are spaced apart along the scan-line direction of the raster scan by steps of half the pitch of the sensing elements 44. Each superpixel comprises a single sensing element 44.
Fig. 8 schematically shows the images of the two illumination spots 26 at two consecutive points in time, t = t_i and t = t_{i+1}. At time t = t_i, the images of the illumination spots are spots 132 and 134, with spot 132 within superpixel 136 and spot 134 within superpixel 138. All other superpixels are turned off. At time t = t_{i+1}, the two spots have moved down by half a superpixel to new locations 142 and 144, as indicated by arrow 140, but remain within the same superpixels 136 and 138. The expected positions of the illumination spot images 142 and 144 at time t = t_{i+1} are determined from the state of the beam steering device 24. Because the two spots are always assigned to separate superpixels, they are individually identifiable, and the resolution of the LiDAR transverse to the scan line is determined by the spacing of the images of the two illumination spots 26 in that direction, not by the pitch of the sensing elements 44, thus alleviating the need for miniaturization of the detector array 28. The area over which each illumination spot 26 averages its distance measurement is the area of that spot.
In another embodiment (not shown in the figures), the number of scanning illumination spots 26 is increased beyond two (as compared to fig. 8), with the illumination spots separated along the raster scan pattern 130 such that the image of each illumination spot falls on a different sensing element 44. For an embodiment in which the images of N illumination spots 26 all fall within one column of the detector array 28, the resolution transverse to the raster scan 130 is the pitch of the sensing elements 44 divided by N.
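The resolution relation stated above can be written as a one-line helper: with N spots whose images fall in one detector column, mutually offset transverse to the scan line, the cross-scan sampling interval is the element pitch divided by N. A sketch under those stated assumptions; the unit choice is illustrative.

```python
# Cross-scan sampling interval for N interleaved illumination spots whose
# images fall within one column of the detector array: pitch / N.

def cross_scan_resolution(pitch_um: float, n_spots: int) -> float:
    """Resolution transverse to the raster scan, in the same units as pitch."""
    return pitch_um / n_spots

# With two spots (as in Fig. 8) the cross-scan resolution doubles:
res_two = cross_scan_resolution(25.0, 2)   # half the assumed 25 um pitch
```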
Linear scan pattern
FIGS. 9-11 are schematic diagrams illustrating LiDAR based on linear scanning, according to embodiments of the present invention. An advantage of linear (one-dimensional) scanning is that it can use a beam steering device of smaller, cheaper and more reliable design than is required for two-dimensional scanning. The resolution in the linear scan direction depends on the resolution of the beam steering device. Since no scanning occurs transverse to the linear scan direction, resolution in that direction is achieved by using multiple illumination spots 26 arranged across the target scene 22.
Fig. 9 is a schematic diagram illustrating a one-dimensional scan imaged onto detector array 28 according to an embodiment of the present invention. The resolution of LiDAR in a direction perpendicular to the linear scan is improved beyond the pitch of the sensing elements 44 by using a pattern 150 of images of the illumination spots 26 that includes two staggered columns 151 and 152, with circles 153 representing the expected locations of the images of the individual illumination spots on the sensor array 28. Arrow 154 indicates the direction of scanning.
In each of columns 151 and 152 of pattern 150, the spacing of the images of the illumination spots 26 along the axis of the respective column (as indicated by circles 153) is equal to the pitch of the sensing elements 44. The two columns 151 and 152 are offset from each other in the direction of the column axis by half the pitch of the sensing elements 44, and are spaced apart in the scan direction so that the two columns fall on separate sensing elements. In some embodiments (not shown in the figures), the resolution transverse to the linear scan is further improved by using more than two columns of illumination spots 26 with smaller mutual offsets in the direction of the column axes. Thus, for example, quarter-pitch resolution is achieved using four columns of illumination spots that are offset from one another by a quarter of the pitch of the sensing elements 44.
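The staggered-column geometry just described can be sketched by generating the expected spot-image centers. Coordinates are in units of sensing-element pitch; the column separation value and all names are illustrative assumptions.

```python
# Sketch of generating spot-image positions for a staggered multi-column
# pattern (cf. pattern 150): each column carries one spot per element pitch,
# and successive columns are offset along the column axis by pitch / n_columns,
# so that together they sample at pitch / n_columns transverse to the scan.

def staggered_pattern(n_spots_per_column, n_columns=2, column_separation=2.0):
    """Return (x, y) spot-image centers in pitch units; x is the scan
    direction, y the column axis."""
    spots = []
    for c in range(n_columns):
        x = c * column_separation          # columns separated along the scan
        for k in range(n_spots_per_column):
            spots.append((x, k + c / n_columns))  # per-column offset of pitch/n
    return spots
```

With `n_columns=2` the merged y-positions step by half a pitch, matching the half-pitch cross-scan resolution of the two-column pattern; `n_columns=4` gives the quarter-pitch case.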
FIG. 10 is a schematic diagram illustrating a one-dimensional scanning based LiDAR 159 according to an embodiment of the present invention. The beam from a single pulsed laser source 160 is split into two staggered columns of multiple beams by a Diffractive Optical Element (DOE) 162. These beams are directed onto and scanned over the target scene 22 by the single axis beam steering device 166, thereby forming two interleaved columns of illumination spots 168 on the target scene 22. The illumination spot is imaged by collection optics 27 onto detector array 28, forming two staggered columns 151 and 152 in pattern 150 as shown in FIG. 9.
Only the sensing elements 44 containing the image of the illumination spot 26 in the pattern 150 are actuated at any given moment during the scan, the remaining sensing elements being turned off, thereby preventing unnecessary integration of background light and achieving a high signal-to-background ratio. Similar to fig. 1, the control circuit 38 is connected to the laser light source 160, the beam control device 166, and the detector array 28, controls their functions, and gathers data to determine the distance to the target scene 22 by using time-of-flight data.
FIG. 11 is a schematic diagram illustrating LiDAR170 based on one-dimensional scanning and coaxial optical structures in accordance with another embodiment of the present invention. The beam from a single pulsed laser source 160 is split into two staggered columns of multiple beams by DOE 162. These beams pass through the polarizing beam splitter 176 and are directed by the single axis beam steering device 166 to the target scene 22 and scanned thereon, thereby forming two staggered columns of illumination spots 168. The illumination spot reflected from the target scene 22 is imaged onto the detector array 28 by the beam steering device 166, polarizing beam splitter 176 and collection optics 27, thereby forming two staggered columns 151 and 152 in the pattern 150, as shown in fig. 9.
The pattern 150 on the detector array 28 is (almost) stationary with respect to scanning due to the coaxial architecture of optical emission and collection. Thus, the number of columns of sensor elements 44 on the detector array along an axis perpendicular to the scan direction may be significantly smaller than the number of rows of sensor elements along the scan direction. Similar to fig. 1, the control circuit 38 is connected to the laser light source 160, the beam control device 166, and the detector array 28, controls their functions, and collects data to determine the distance to the target scene 22 using the time-of-flight data.
In both embodiments shown in fig. 10 and 11, the lateral resolution perpendicular to the scan direction is half the pitch of the sensing elements 44, and the resolution along the scan is determined by the scan rate of the beam steering device 166 and the pulse repetition rate of the laser source 160. Each of the illumination spots 168 makes an average distance measurement over the area of that spot.
The vertical orientations of columns 151 and 152 in pattern 150 are shown here by way of example, and alternate orientations embodying similar principles are considered to be within the scope of the present invention.
Multi-range sensing
FIGS. 12-13 are schematic diagrams illustrating short and long range LiDARs that adapt themselves to a target scene, according to embodiments of the present invention.
FIG. 12 is a schematic diagram illustrating a LiDAR 199, according to an embodiment of the invention, which adapts itself to measuring distances to both near and far regions of the target scene. The beam of a pulsed laser light source 200 is directed by the dual-axis beam control device 24 to the target scene 22, forming an illumination spot 206 on the target scene and scanning the spot over the target scene. The illumination spot 206 is imaged by the collection optics 27 onto the detector array 28. The control circuit 38 is connected to the laser source 200, the beam control device 24 and the detector array 28.
Under control of the signal from the control circuit 38, the laser light source 200 has the capability to emit light at two power levels: low transmit power and high transmit power. At the same time, the sensing elements 44 of the detector array 28 (see FIG. 2) have the capability to operate in two different modes: short range mode and long range mode. For a given mode of operation of a particular sensing element, control circuit 38 will adjust its timing and sensitivity, as well as the signal processing algorithm for optimal performance in that mode. Typically, in short range mode, the sensing element 44 is biased to obtain relatively low sensitivity (which also results in low noise) and is gated to sense a short time of flight. In the long range mode, the sensing element 44 is biased for relatively high sensitivity and is gated to sense longer flight times, thereby reducing the likelihood of false detection of short range reflections.
To determine the required mode of operation for each region of the target scene 22, the region is first scanned using the laser source 200 at its low emission power level, which is suited to short-range detection. The sensing elements 44 in the detector array 28 that receive light from the laser source 200 are actuated with timing, sensitivity, and associated signal processing algorithms set for short-range measurements.
After this short-range scan, the control circuit 38 controls the LiDAR 199 to perform a long-range scan only in areas for which the short-range, low-power scan did not produce a sufficiently robust distance measurement according to a predetermined criterion. In the long-range scan, the measurements of these regions are repeated using the high emission power level of the light source 200, with the timing, sensitivity, and algorithms of the sensing elements 44 that are actuated to receive reflected light from these regions changed appropriately.
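The two-pass logic above can be sketched as follows: a first low-power pass covers every region in short-range mode, and a second high-power pass revisits only the regions whose first measurement failed the robustness criterion. The robustness test and all names are assumed placeholders, not the patent's criterion.

```python
# Sketch of the adaptive short-range/long-range measurement scheme.

LOW_POWER, HIGH_POWER = "low", "high"

def two_pass_scan(regions, measure, robust):
    """regions: iterable of region ids; measure(region, power) -> result;
    robust(result) -> bool, the predetermined robustness criterion."""
    results = {}
    for region in regions:                 # first pass: low power,
        results[region] = measure(region, LOW_POWER)   # short-range mode
    for region in regions:                 # second pass: failures only,
        if not robust(results[region]):    # high power, long-range mode
            results[region] = measure(region, HIGH_POWER)
    return results
```

The same skeleton covers both variants described here: one source switched between two power levels (Fig. 12), or two sources of fixed low and high power selected per region (Fig. 13).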
FIG. 13 is a schematic diagram illustrating a LiDAR 210 according to another embodiment of the present invention, the LiDAR 210 lends itself to measuring distance to both near and far target sites. The beams of the two pulsed laser light sources 218 and 220 are directed by the dual-axis beam control device 24 to the target scene 22, forming an illumination spot 226 on the target scene 22 and scanning the illumination spot across the target scene 22. (the spacing between laser sources 218 and 220 is exaggerated in FIG. 13 to show two separate sources). As described in detail below, only one laser source emits at a given time. The illumination spot 226 is imaged by collection optics 27 onto detector array 28. The control circuit 38 is connected to the laser light sources 218 and 220, the beam control device 24 and the detector array 28.
Each laser source 218,220, when actuated, emits at a particular emitted power level, with the laser source 218 emitting at a low emitted power level and the laser source 220 emitting at a high emitted power level. The control circuit 38 selects which laser source to actuate at each point in the scan according to the kind of criteria described above with reference to fig. 12. Similarly, the sensing elements 44 of the detector array 28 (see FIG. 2) have the ability to operate in two different modes: short range mode and long range mode. For a given mode of operation of a particular sensing element 44, the control circuit 38 will adjust its timing and sensitivity and its signal processing algorithm to obtain optimal performance in that mode.
To determine the required mode of operation in a given region of the target scene 22, the region is first scanned using the low-power laser source 218. Those sensing elements 44 in the detector array 28 that receive light from the laser source 218 are actuated with timing, sensitivity, and associated signal processing algorithms set for short-range measurements. As in the preceding embodiment, if the control circuit 38 determines that a sufficiently robust distance measurement cannot be made for a given area using laser source 218, the measurement for that area is repeated using laser source 220 at the higher emission power, with appropriate changes in the timing, sensitivity, and algorithms of the sensing elements 44 that are actuated to receive light from laser source 220.
It should be understood that the embodiments described above are cited by way of example, and that the present invention is not limited to what has been particularly shown or described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art.
Claims (31)
1. An electro-optical device, comprising:
a laser light source configured to emit at least one beam of light;
a beam control device configured to transmit and scan the at least one beam across a target scene;
an array of sensing elements, each sensing element configured to output a signal indicative of photon incidence on the sensing element;
light collection optics configured to image the target scene scanned by the transmitted light beam onto the array,
wherein the beam control means scans the at least one beam across the target scene with a scanning resolution and spot size smaller than the pitch of the sensing elements; and
circuitry coupled to actuate sensing elements only in a selected region of the array and to sweep the selected region across the array in synchronization with the scanning of the at least one light beam,
Wherein the laser light source is configured to emit at least two light beams along different respective beam axes such that at any time during scanning of the at least one light beam, the light collection optics image respective areas of the target scene illuminated by the at least two light beams onto different respective sensing elements, and
wherein the beam control device is configured to scan the at least two beams across the target scene in a two-dimensional scan, and the circuitry is configured to sweep a selected area across the array in a two-dimensional pattern corresponding to the two-dimensional scan,
wherein the two-dimensional scan forms a raster pattern, and wherein respective beam axes of the at least two beams are offset from each other laterally with respect to a scan line direction of the raster pattern.
2. The apparatus of claim 1, wherein the circuitry is configured to select an area such that, at any time during scanning of the at least one light beam, the selected area contains a portion of the array onto which the light collection optics image an area of the target scene illuminated by the at least one light beam.
3. The apparatus of claim 2, wherein the selected area comprises a sensing element.
4. The apparatus of claim 2, wherein the selected area comprises a plurality of sensing elements.
5. The apparatus of claim 1, wherein the circuitry is configured to process signals output by sensing elements in order to determine respective distances to points in the target scene.
6. The apparatus of any of claims 1-5, wherein the sensing element comprises a single photon detector.
7. The apparatus of claim 6, wherein the single photon detector is a Single Photon Avalanche Diode (SPAD).
8. A method for sensing, comprising:
emitting at least one beam of light;
transmitting and scanning the at least one light beam across a target scene;
providing an array of sensing elements, each sensing element configured to output a signal indicative of photon incidence on the sensing element;
imaging the target scene scanned by the transmitted beam onto the array,
wherein the at least one light beam is scanned across the target scene with a scanning resolution and spot size smaller than a pitch of the sensing elements; and
actuating sensing elements only in selected areas of the array, and sweeping the selected areas across the array in synchronization with the scanning of the at least one light beam,
Wherein emitting at least one light beam comprises emitting at least two light beams along different respective beam axes such that at any time during scanning of the at least one light beam, light collecting optics image respective areas of the target scene illuminated by the at least two light beams onto different respective sensing elements,
wherein scanning the at least one light beam comprises scanning the at least two light beams across the target scene in a two-dimensional scan, and actuating the sensing element comprises sweeping a selected area across the array in a two-dimensional pattern corresponding to the two-dimensional scan,
wherein the two-dimensional scan forms a raster pattern, and wherein respective beam axes of the at least two beams are offset from each other laterally with respect to a scan line direction of the raster pattern.
9. The method of claim 8, wherein actuating a sensing element comprises selecting an area such that at any time during scanning of the at least one light beam, the selected area comprises a portion of the array onto which light collection optics image an area of the target scene illuminated by the at least one light beam.
10. The method of claim 8, and comprising processing the signals output by the sensing elements to determine respective distances to points in the target scene.
11. The method of claim 8, wherein the sensing element comprises a single photon detector.
12. An electro-optical device, comprising:
at least one laser light source configured to emit at least one beam of light pulses having an emission power selectable between a low level and a high level;
a beam control device configured to transmit and scan the at least one beam across a target scene;
one or more sensing elements configured to output a signal indicative of the time of incidence of a single photon on the sensing element;
light collection optics configured to image the target scene scanned by the transmitted at least one light beam onto the one or more sensing elements; and
circuitry coupled to process signals output by the one or more sensing elements to determine respective distances to points in the target scene, control the at least one laser light source to emit the at least one light beam at a low level during a first scan of the beam control device over the target scene, identify points in the target scene for which the first scan does not make robust distance measurements, and control the at least one laser light source to emit the at least one light beam at a high level during a second scan of the beam control device after the first scan, while the beam control device directs the at least one light beam toward the identified points.
13. The apparatus of claim 12, wherein the at least one laser light source comprises a laser light source having an output selectable between the low level and the high level.
14. The apparatus of claim 12, wherein the at least one laser light source comprises at least two lasers, including at least a first laser configured to emit light pulses at a low level and at least a second laser configured to emit light pulses at a high level.
15. The apparatus of claim 12, wherein the circuitry is configured to set at least one of timing and sensitivity of a sensing element to different respective values during the first scan and during the second scan.
16. The apparatus of claim 12, wherein the circuitry is configured to control the at least one laser light source during the second scan to direct the at least one light beam at the high level only to points for which the first scan does not make robust distance measurements.
17. The apparatus of any of claims 12-16, wherein the one or more sensing elements comprise an array of sensing elements, and wherein the circuitry is configured to actuate sensing elements only in a selected region of the array and sweep the selected region across the array in synchronization with the scanning of the at least one light beam.
18. The apparatus of claim 17, wherein the circuitry is configured to select an area such that at any time during the scan, the selected area contains a portion of the array onto which the light collection optics image an area of the target scene illuminated by the at least one light beam.
19. The apparatus of claim 17, wherein the at least one laser light source is configured to emit at least two light beams along different respective beam axes such that at any time during the scanning, the light collection optics image respective areas of the target scene illuminated by the at least two light beams onto different respective sensing elements.
20. The apparatus of any of claims 12-16, wherein the sensing element comprises a single photon detector.
21. The apparatus of claim 20, wherein the single photon detector is a Single Photon Avalanche Diode (SPAD).
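Claims 12-16 describe an adaptive two-scan control loop: a first scan of the whole scene at low emission power, identification of points without robust distance measurements, and a second scan that directs high power only at those points. The sketch below is a minimal illustration of that loop; the power levels, robustness threshold, and inverse-square signal model are illustrative assumptions, not values from the patent.

```python
LOW, HIGH = 0.1, 10.0      # relative emission power levels (illustrative values)
ROBUST_THRESHOLD = 2.0     # minimum return signal for a "robust" measurement (assumed)

def measure(point, power):
    """Toy stand-in for one time-of-flight measurement: the return signal
    grows with emitted power and falls off with the square of the range.
    (A real sensor would add shot noise and ambient background here.)"""
    return power * 100.0 / (1.0 + point["range_m"]) ** 2

def two_pass_scan(points):
    depth, weak = {}, []
    # First scan: every point at LOW power (power-saving / eye-safe pass).
    for i, p in enumerate(points):
        if measure(p, LOW) >= ROBUST_THRESHOLD:
            depth[i] = p["range_m"]   # robust: keep the first-pass distance
        else:
            weak.append(i)            # flag the point for the second pass
    # Second scan: HIGH power is directed only at the flagged points.
    for i in weak:
        if measure(points[i], HIGH) >= ROBUST_THRESHOLD:
            depth[i] = points[i]["range_m"]
    return depth, weak
```

For example, a near point returns enough signal at low power, while a distant point is flagged and only resolved on the high-power pass.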
22. A method for sensing, comprising:
selecting an emission power of at least one laser light source between a low level and a high level;
emitting at least one light beam of light pulses having the selected emission power;
transmitting and scanning the at least one light beam across a target scene;
providing one or more sensing elements, each sensing element configured to output a signal indicative of the time of incidence of a single photon on the sensing element;
imaging the target scene scanned by the transmitted at least one light beam onto the one or more sensing elements;
controlling at least one laser light source to emit the at least one light beam at a low level during a first scan of a beam control device over the target scene;
processing signals output by the one or more sensing elements to determine respective distances to points in the target scene;
identifying points in the target scene for which the first scan does not make robust distance measurements; and
controlling the at least one laser light source to emit the at least one light beam at the high level during a second scan of the beam control device after the first scan, while directing the at least one light beam toward the identified points.
23. The method of claim 22, wherein controlling the at least one laser light source comprises switching an output level of the laser light source between the low level and the high level.
24. The method of claim 22, wherein controlling the at least one laser light source comprises alternately selecting an output of a first laser light source configured to emit light pulses at a low level and selecting an output of a second laser light source configured to emit light pulses at a high level.
25. The method of claim 22, comprising setting at least one of timing and sensitivity of a sensing element to different respective values during the first scan and during the second scan.
26. The method of claim 22, wherein controlling the at least one laser light source comprises controlling the at least one laser light source during the second scan to direct the at least one light beam at the high level only to points for which the first scan does not make robust distance measurements.
27. The method of any of claims 22-26, wherein the one or more sensing elements comprise an array of sensing elements, and wherein the method comprises actuating sensing elements only in a selected region of the array and sweeping the selected region across the array in synchronization with the scanning of the at least one light beam.
28. The method of claim 27, wherein actuating a sensing element comprises selecting a region such that at any time during the scan, the selected region comprises a portion of the array onto which light collection optics image a region of the target scene illuminated by the at least one light beam.
29. The method of claim 27, wherein emitting at least one light beam comprises emitting at least two light beams along different respective light beam axes such that at any time during the scanning, light collection optics image respective areas of the target scene illuminated by the at least two light beams onto different respective sensing elements.
30. The method of any of claims 22-26, wherein the sensing element comprises a single photon detector.
31. The method of claim 30, wherein the single photon detector is a Single Photon Avalanche Diode (SPAD).
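Claims 17-18 and 27-28 actuate sensing elements only in a selected region of the array and sweep that region in synchronization with the beam scan, so the active region always contains the pixels onto which the collection optics image the illuminated spot. The sketch below illustrates that windowing under an assumed linear mapping from beam angle to array column; the field of view, array size, and window width are illustrative assumptions.

```python
def active_window(beam_angle_deg, fov_deg=40.0, n_cols=64, win=4):
    """Map the instantaneous beam angle to the array column onto which the
    collection optics image the illuminated spot (linear model, illustrative),
    and return the window of columns to actuate around it. Columns outside
    this window stay unbiased, saving power and rejecting background light."""
    frac = (beam_angle_deg + fov_deg / 2) / fov_deg   # 0..1 across the field of view
    center = min(n_cols - 1, max(0, int(frac * n_cols)))
    lo = max(0, center - win // 2)
    hi = min(n_cols, lo + win)
    return range(lo, hi)   # only these columns are enabled at this scan angle
```

As the beam sweeps from one edge of the field of view to the other, the returned column window slides across the array in step with it, which is the synchronization the claims describe.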
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010063812.6A CN111239708B (en) | 2015-12-20 | 2016-12-08 | Light detection and ranging sensor |
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/975,790 US9997551B2 (en) | 2015-12-20 | 2015-12-20 | Spad array with pixel-level bias control |
US14/975,790 | 2015-12-20 | ||
US201662353588P | 2016-06-23 | 2016-06-23 | |
US62/353,588 | 2016-06-23 | ||
CN201680074428.8A CN108431626B (en) | 2015-12-20 | 2016-12-08 | Light detection and ranging sensor |
CN202010063812.6A CN111239708B (en) | 2015-12-20 | 2016-12-08 | Light detection and ranging sensor |
PCT/US2016/065472 WO2017112416A1 (en) | 2015-12-20 | 2016-12-08 | Light detection and ranging sensor |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201680074428.8A Division CN108431626B (en) | 2015-12-20 | 2016-12-08 | Light detection and ranging sensor |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111239708A (en) | 2020-06-05
CN111239708B (en) | 2024-01-09
Family
ID=57570664
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010063812.6A Active CN111239708B (en) | 2015-12-20 | 2016-12-08 | Light detection and ranging sensor |
CN201680074428.8A Active CN108431626B (en) | 2015-12-20 | 2016-12-08 | Light detection and ranging sensor |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201680074428.8A Active CN108431626B (en) | 2015-12-20 | 2016-12-08 | Light detection and ranging sensor |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP3391076A1 (en) |
JP (2) | JP6644892B2 (en) |
CN (2) | CN111239708B (en) |
WO (1) | WO2017112416A1 (en) |
Families Citing this family (89)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9741754B2 (en) | 2013-03-06 | 2017-08-22 | Apple Inc. | Charge transfer circuit with storage nodes in image sensors |
US9686485B2 (en) | 2014-05-30 | 2017-06-20 | Apple Inc. | Pixel binning in an image sensor |
US10761195B2 (en) | 2016-04-22 | 2020-09-01 | OPSYS Tech Ltd. | Multi-wavelength LIDAR system |
CN111682039B (en) | 2016-09-23 | 2021-08-03 | 苹果公司 | Stacked back side illumination SPAD array |
US10917626B2 (en) | 2016-11-23 | 2021-02-09 | Microsoft Technology Licensing, Llc | Active illumination 3D imaging system |
US10605984B2 (en) | 2016-12-01 | 2020-03-31 | Waymo Llc | Array of waveguide diffusers for light detection using an aperture |
US10502618B2 (en) | 2016-12-03 | 2019-12-10 | Waymo Llc | Waveguide diffuser for light detection using an aperture |
US10656251B1 (en) | 2017-01-25 | 2020-05-19 | Apple Inc. | Signal acquisition in a SPAD detector |
EP3574344B1 (en) | 2017-01-25 | 2024-06-26 | Apple Inc. | Spad detector having modulated sensitivity |
US10962628B1 (en) | 2017-01-26 | 2021-03-30 | Apple Inc. | Spatial temporal weighting in a SPAD detector |
JP7037830B2 (en) | 2017-03-13 | 2022-03-17 | オプシス テック リミテッド | Eye safety scanning lidar system |
CN113466882A (en) * | 2017-07-05 | 2021-10-01 | 奥斯特公司 | Optical distance measuring device |
EP3428574A1 (en) * | 2017-07-11 | 2019-01-16 | Fondazione Bruno Kessler | Device for measuring a distance and method for measuring said distance |
US10430958B2 (en) * | 2017-07-11 | 2019-10-01 | Microsoft Technology Licensing, Llc | Active illumination 3D zonal imaging system |
US10901073B2 (en) | 2017-07-11 | 2021-01-26 | Microsoft Technology Licensing, Llc | Illumination for zoned time-of-flight imaging |
WO2019014494A1 (en) * | 2017-07-13 | 2019-01-17 | Apple Inc. | Early-late pulse counting for light emitting depth sensors |
US10622538B2 (en) | 2017-07-18 | 2020-04-14 | Apple Inc. | Techniques for providing a haptic output and sensing a haptic input using a piezoelectric body |
WO2019022941A1 (en) | 2017-07-28 | 2019-01-31 | OPSYS Tech Ltd. | Vcsel array lidar transmitter with small angular divergence |
US10698088B2 (en) | 2017-08-01 | 2020-06-30 | Waymo Llc | LIDAR receiver using a waveguide and an aperture |
US10677899B2 (en) | 2017-08-07 | 2020-06-09 | Waymo Llc | Aggregating non-imaging SPAD architecture for full digital monolithic, frame averaging receivers |
EP3451021A1 (en) | 2017-08-30 | 2019-03-06 | Hexagon Technology Center GmbH | Measuring device with scan functionality and adjustable receiving areas of the receiver |
US10440301B2 (en) | 2017-09-08 | 2019-10-08 | Apple Inc. | Image capture device, pixel, and method providing improved phase detection auto-focus performance |
US10473923B2 (en) * | 2017-09-27 | 2019-11-12 | Apple Inc. | Focal region optical elements for high-performance optical scanners |
JP7388720B2 (en) * | 2017-11-15 | 2023-11-29 | オプシス テック リミテッド | Noise-adaptive solid-state LIDAR system |
DE102018203534A1 (en) * | 2018-03-08 | 2019-09-12 | Ibeo Automotive Systems GmbH | Receiver arrangement for receiving light pulses, LiDAR module and method for receiving light pulses |
JP7324518B2 (en) | 2018-04-01 | 2023-08-10 | オプシス テック リミテッド | Noise adaptive solid-state lidar system |
CN112154348B (en) * | 2018-04-09 | 2024-08-23 | 奥卢大学 | Distance imaging apparatus and method |
JP2019191126A (en) * | 2018-04-27 | 2019-10-31 | シャープ株式会社 | Optical radar device |
DE102018113848A1 (en) * | 2018-06-11 | 2019-12-12 | Sick Ag | Optoelectronic sensor and method for acquiring three-dimensional image data |
US10848693B2 (en) | 2018-07-18 | 2020-11-24 | Apple Inc. | Image flare detection using asymmetric pixels |
US11019294B2 (en) | 2018-07-18 | 2021-05-25 | Apple Inc. | Seamless readout mode transitions in image sensors |
EP3608688B1 (en) * | 2018-08-09 | 2021-01-27 | OMRON Corporation | Distance measuring device |
EP3620822A1 (en) * | 2018-09-06 | 2020-03-11 | STMicroelectronics (Research & Development) Limited | Non-contiguous layouts for photosensitive apparatus |
US11914078B2 (en) * | 2018-09-16 | 2024-02-27 | Apple Inc. | Calibration of a depth sensing array using color image data |
US11237256B2 (en) * | 2018-09-19 | 2022-02-01 | Waymo Llc | Methods and systems for dithering active sensor pulse emissions |
CN112740065B (en) * | 2018-09-25 | 2024-06-25 | 苹果公司 | Imaging device, method for imaging and method for depth mapping |
US11233966B1 (en) | 2018-11-29 | 2022-01-25 | Apple Inc. | Breakdown voltage monitoring for avalanche diodes |
JP7172963B2 (en) * | 2018-12-14 | 2022-11-16 | 株式会社デンソー | Optical distance measuring device, method for manufacturing laser light emitting device |
WO2020121959A1 (en) * | 2018-12-14 | 2020-06-18 | 株式会社デンソー | Optical distance measurement device, laser light emission device, and method for manufacturing same |
DE102018222777A1 (en) * | 2018-12-21 | 2020-06-25 | Robert Bosch Gmbh | Optoelectronic sensor and method for operating an optoelectronic sensor |
JP2020106339A (en) * | 2018-12-26 | 2020-07-09 | ソニーセミコンダクタソリューションズ株式会社 | Measuring device and distance measuring device |
EP3899575A4 (en) * | 2019-01-31 | 2022-08-31 | Sense Photonics, Inc. | Strobe window dependent illumination for flash lidar |
US10955234B2 (en) * | 2019-02-11 | 2021-03-23 | Apple Inc. | Calibration of depth sensing using a sparse array of pulsed beams |
WO2020182591A1 (en) * | 2019-03-08 | 2020-09-17 | Osram Gmbh | Component for a lidar sensor system, lidar sensor system, lidar sensor device, method for a lidar sensor system and method for a lidar sensor device |
JP7337517B2 (en) * | 2019-03-14 | 2023-09-04 | 株式会社東芝 | Photodetector and distance measuring device |
US11796642B2 (en) | 2019-03-26 | 2023-10-24 | Infineon Technologies Ag | Oversamplng and transmitter shooting pattern for light detection and ranging (LIDAR) system |
EP3953727A4 (en) | 2019-04-09 | 2023-01-04 | Opsys Tech Ltd. | Solid-state lidar transmitter with laser control |
CN110109085B (en) * | 2019-04-15 | 2022-09-30 | 东南大学 | Low-power consumption wide-range array type photon timing reading circuit based on dual-mode switching |
US11320535B2 (en) | 2019-04-24 | 2022-05-03 | Analog Devices, Inc. | Optical system for determining interferer locus among two or more regions of a transmissive liquid crystal structure |
JP7259525B2 (en) * | 2019-04-26 | 2023-04-18 | 株式会社デンソー | Optical ranging device and method |
US11480685B2 (en) * | 2019-05-05 | 2022-10-25 | Apple Inc. | Compact optical packaging of LiDAR systems using diffractive structures behind angled interfaces |
EP3969938A4 (en) * | 2019-05-13 | 2023-05-17 | Ouster, Inc. | Synchronized image capturing for electronic scanning lidar systems |
CN110068808A (en) * | 2019-05-29 | 2019-07-30 | 南京芯视界微电子科技有限公司 | The receiver apparatus and laser radar of laser radar |
KR20220003600A (en) | 2019-05-30 | 2022-01-10 | 옵시스 테크 엘티디 | Eye-safe long-distance LIDAR system using actuators |
CN113924506A (en) | 2019-06-10 | 2022-01-11 | 欧普赛斯技术有限公司 | Eye-safe long-range solid-state LIDAR system |
TWI748460B (en) * | 2019-06-21 | 2021-12-01 | 大陸商廣州印芯半導體技術有限公司 | Time of flight device and time of flight method |
KR20220024177A (en) | 2019-06-25 | 2022-03-03 | 옵시스 테크 엘티디 | Adaptive multi-pulse LIDAR system |
JP2021015095A (en) * | 2019-07-16 | 2021-02-12 | パイオニア株式会社 | Distance measuring device |
DE102019211739A1 (en) * | 2019-08-06 | 2021-02-11 | Ibeo Automotive Systems GmbH | Lidar measuring system with two lidar measuring devices |
JP2021039069A (en) * | 2019-09-05 | 2021-03-11 | 株式会社東芝 | Photodetector, electronic device, and photodetection method |
JP2021043131A (en) * | 2019-09-13 | 2021-03-18 | ソニーセミコンダクタソリューションズ株式会社 | Distance measuring device and method for adjusting deviation of distance measuring mechanism in said device |
CN110596721B (en) * | 2019-09-19 | 2022-06-14 | 深圳奥锐达科技有限公司 | Flight time distance measuring system and method of double-shared TDC circuit |
CN110596724B (en) * | 2019-09-19 | 2022-07-29 | 深圳奥锐达科技有限公司 | Method and system for measuring flight time distance during dynamic histogram drawing |
CN110687541A (en) * | 2019-10-15 | 2020-01-14 | 深圳奥锐达科技有限公司 | Distance measuring system and method |
CN110780312B (en) * | 2019-10-15 | 2022-10-21 | 深圳奥锐达科技有限公司 | Adjustable distance measuring system and method |
JP2021071458A (en) * | 2019-11-01 | 2021-05-06 | ソニーセミコンダクタソリューションズ株式会社 | Light receiving device, ranging device, and light receiving circuit |
CN111090104B (en) * | 2019-12-26 | 2022-11-11 | 维沃移动通信有限公司 | Imaging processing method and electronic device |
CN113126104A (en) * | 2019-12-27 | 2021-07-16 | 精准基因生物科技股份有限公司 | Time-of-flight polarization light sensing system and light emitter thereof |
US20220357431A1 (en) | 2019-12-30 | 2022-11-10 | Lumus Ltd. | Detection and ranging systems employing optical waveguides |
JPWO2021161858A1 (en) * | 2020-02-14 | 2021-08-19 | ||
CN113359142A (en) * | 2020-03-06 | 2021-09-07 | 上海禾赛科技有限公司 | Laser radar and ranging method thereof |
JP7434002B2 (en) * | 2020-03-17 | 2024-02-20 | 株式会社東芝 | Photodetector and distance measuring device |
CN113447933A (en) * | 2020-03-24 | 2021-09-28 | 上海禾赛科技有限公司 | Detection unit of laser radar, laser radar and detection method thereof |
CN111352095A (en) * | 2020-04-17 | 2020-06-30 | 深圳市镭神智能系统有限公司 | Laser radar receiving system and laser radar |
CN111610534B (en) * | 2020-05-07 | 2022-12-02 | 广州立景创新科技有限公司 | Image forming apparatus and image forming method |
US11476372B1 (en) | 2020-05-13 | 2022-10-18 | Apple Inc. | SPAD-based photon detectors with multi-phase sampling TDCs |
CN113970757A (en) * | 2020-07-23 | 2022-01-25 | 华为技术有限公司 | Depth imaging method and depth imaging system |
CN114063043A (en) * | 2020-07-30 | 2022-02-18 | 北京一径科技有限公司 | Control method and device of photoelectric detection array, photoelectric power supply switching circuit and photoelectric detection array |
JP7476033B2 (en) * | 2020-08-24 | 2024-04-30 | 株式会社東芝 | Light receiving device and electronic device |
JP7434115B2 (en) | 2020-09-07 | 2024-02-20 | 株式会社東芝 | Photodetector and distance measuring device |
JP7423485B2 (en) * | 2020-09-18 | 2024-01-29 | 株式会社東芝 | distance measuring device |
CN112346075B (en) * | 2020-10-01 | 2023-04-11 | 奥比中光科技集团股份有限公司 | Collector and light spot position tracking method |
WO2022201501A1 (en) * | 2021-03-26 | 2022-09-29 | パイオニア株式会社 | Sensor device |
WO2022201502A1 (en) * | 2021-03-26 | 2022-09-29 | パイオニア株式会社 | Sensor device |
JP7443287B2 (en) | 2021-06-09 | 2024-03-05 | 株式会社東芝 | Photodetector and distance measuring device |
CN115980763A (en) * | 2021-10-15 | 2023-04-18 | 华为技术有限公司 | Detection method and device |
JP2023066231A (en) * | 2021-10-28 | 2023-05-15 | 株式会社デンソー | Control device, control method, and control program |
WO2023149242A1 (en) * | 2022-02-03 | 2023-08-10 | 株式会社小糸製作所 | Measurement device |
CN116184436B (en) * | 2023-03-07 | 2023-11-17 | 哈尔滨工业大学 | Array orbital angular momentum cloud penetration and fog penetration quantum detection imaging system |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10170637A (en) * | 1996-12-16 | 1998-06-26 | Omron Corp | Light scanner |
JPH1163920A (en) * | 1997-08-26 | 1999-03-05 | Matsushita Electric Works Ltd | Optically scanning system displacement measuring equipment |
CN1639539A (en) * | 2002-01-11 | 2005-07-13 | 通用医疗公司 | Apparatus for OCT imaging with axial line focus for improved resolution and depth of field |
CN201054040Y (en) * | 2007-05-21 | 2008-04-30 | 一品光学工业股份有限公司 | Micro electromechanical swinged laser scanning device |
CN101401107A (en) * | 2005-06-13 | 2009-04-01 | 数据逻辑扫描公司 | System and method for data reading using raster scanning |
CN101506999A (en) * | 2006-06-14 | 2009-08-12 | 厄利肯鲍泽斯涂层(英国)有限公司 | A method for laser scribing lines |
CN101825431A (en) * | 2009-03-05 | 2010-09-08 | 普莱姆森斯有限公司 | Reference image techniques for three-dimensional sensing |
EP2446301A1 (en) * | 2009-06-22 | 2012-05-02 | Toyota Motor Europe NV/SA | Pulsed light optical rangefinder |
CN102714706A (en) * | 2009-09-30 | 2012-10-03 | 苹果公司 | Display system having coherent and incoherent light sources |
DE202013101039U1 (en) * | 2013-03-11 | 2014-03-12 | Sick Ag | Optoelectronic sensor for distance measurement |
CN103703363A (en) * | 2011-04-15 | 2014-04-02 | 法罗技术股份有限公司 | Six degree-of-freedom laser tracker that cooperates with a remote sensor |
JP2014059301A (en) * | 2012-09-18 | 2014-04-03 | Sick Ag | Photoelectric sensor and depth map detection method |
CN103983979A (en) * | 2014-05-27 | 2014-08-13 | 中国科学院上海光学精密机械研究所 | Synthetic aperture laser imaging radar based on M sequence phase encoding and cross-polarization multiplexing |
CN105143820A (en) * | 2013-03-15 | 2015-12-09 | 苹果公司 | Depth scanning with multiple emitters |
CN105209869A (en) * | 2012-10-23 | 2015-12-30 | 苹果公司 | High accuracy imaging colorimeter by special designed pattern closed-loop calibration assisted by spectrograph |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02287113A (en) * | 1989-04-27 | 1990-11-27 | Asahi Optical Co Ltd | Distance measuring instrument |
JPH0567195A (en) * | 1991-09-05 | 1993-03-19 | Matsushita Electric Ind Co Ltd | Shape measuring instrument |
JP3832101B2 (en) * | 1998-08-05 | 2006-10-11 | 株式会社デンソー | Distance measuring device |
JP2007190566A (en) * | 2006-01-17 | 2007-08-02 | Miyachi Technos Corp | Fiber laser beam machining apparatus |
JP2011089874A (en) * | 2009-10-22 | 2011-05-06 | Toyota Central R&D Labs Inc | Distance image data acquisition device |
LU91688B1 (en) * | 2010-05-17 | 2011-11-18 | Iee Sarl | Scanning 3D imager |
WO2013028691A1 (en) * | 2011-08-25 | 2013-02-28 | Georgia Tech Research Corporation | Gas sensors and methods of preparation thereof |
JP2013113669A (en) * | 2011-11-28 | 2013-06-10 | Mitsubishi Electric Corp | Laser radar device |
KR102038533B1 (en) * | 2012-06-14 | 2019-10-31 | 한국전자통신연구원 | Laser Radar System and Method for Acquiring Target Image |
CN106165399B (en) * | 2014-04-07 | 2019-08-20 | 三星电子株式会社 | The imaging sensor of high-resolution, high frame per second, low-power |
2016
- 2016-12-08 EP EP16813340.3A patent/EP3391076A1/en not_active Ceased
- 2016-12-08 JP JP2018530709A patent/JP6644892B2/en active Active
- 2016-12-08 WO PCT/US2016/065472 patent/WO2017112416A1/en active Application Filing
- 2016-12-08 CN CN202010063812.6A patent/CN111239708B/en active Active
- 2016-12-08 CN CN201680074428.8A patent/CN108431626B/en active Active

2020
- 2020-01-08 JP JP2020001203A patent/JP6899005B2/en active Active
Non-Patent Citations (2)
Title |
---|
Cristiano Niclass. Design and characterization of a 256x64-pixel single-photon imager in CMOS for a MEMS based laser scanning time-of-flight sensor. OPTICS EXPRESS, 2012, pp. 1-19. *
Zhu Jian. Simulation research on super-resolution reconstruction of infrared images. China Master's Theses Full-text Database, Information Science and Technology, 2005, pp. I135-17. *
Also Published As
Publication number | Publication date |
---|---|
CN108431626B (en) | 2022-06-17 |
JP6644892B2 (en) | 2020-02-12 |
CN111239708A (en) | 2020-06-05 |
JP6899005B2 (en) | 2021-07-07 |
EP3391076A1 (en) | 2018-10-24 |
CN108431626A (en) | 2018-08-21 |
JP2020073901A (en) | 2020-05-14 |
WO2017112416A1 (en) | 2017-06-29 |
JP2018537680A (en) | 2018-12-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111239708B (en) | Light detection and ranging sensor | |
US10795001B2 (en) | Imaging system with synchronized scan and sensing | |
US10324171B2 (en) | Light detection and ranging sensor | |
US11681027B2 (en) | Time-of-flight depth mapping with parallax compensation | |
EP3722832B1 (en) | Laser radar system | |
CN111727381B (en) | Multi-pulse LiDAR system for multi-dimensional sensing of objects | |
EP3704510B1 (en) | Time-of-flight sensing using an addressable array of emitters | |
US10908266B2 (en) | Time of flight distance sensor | |
KR102409952B1 (en) | High resolution, high frame rate, low power image sensor | |
JP4405154B2 (en) | Imaging system and method for acquiring an image of an object | |
US20190310370A1 (en) | Optoelectronic sensor and method for detection and distance determination of objects | |
JP2019215324A (en) | Photoelectric sensor and distance measurement method | |
US20190094364A1 (en) | Waveform design for a LiDAR system with closely-spaced pulses | |
EP3602110B1 (en) | Time of flight distance measurement system and method | |
Ruokamo et al. | An 80×25 Pixel CMOS Single-Photon Sensor With Flexible On-Chip Time Gating of 40 Subarrays for Solid-State 3-D Range Imaging | |
CN112912765B (en) | Lidar sensor for optically detecting a field of view, working device or vehicle having a Lidar sensor, and method for optically detecting a field of view | |
KR20200033068A (en) | Lidar system | |
US20190349569A1 (en) | High-sensitivity low-power camera system for 3d structured light application | |
Kotake et al. | Performance improvement of real-time 3D imaging ladar based on a modified array receiver |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||