EP3391076A1 - Light detection and ranging sensor - Google Patents

Light detection and ranging sensor

Info

Publication number
EP3391076A1
Authority
EP
European Patent Office
Prior art keywords
scan
target scene
array
sensing elements
beams
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP16813340.3A
Other languages
German (de)
English (en)
French (fr)
Inventor
Cristiano L. NICLASS
Alexander Shpunt
Gennadiy A. Agranov
Matthew C. WALDON
Mina A. Rezk
Thierry Oggier
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US14/975,790 (US9997551B2)
Application filed by Apple Inc
Publication of EP3391076A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/4817 Constructional features, e.g. arrangements of optical elements, relating to scanning
    • G01S7/4861 Circuits for detection, sampling, integration or read-out
    • G01S7/4863 Detector arrays, e.g. charge-transfer gates
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/10 Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G01S17/42 Simultaneous measurement of distance and other co-ordinates
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar

Definitions

  • the present invention relates generally to opto-electronic devices, and particularly to light detection and ranging (LiDAR) sensors.
  • a commonly used technique to determine the distance to each point on the target scene involves sending an optical beam towards the target scene and measuring the round-trip time, i.e. the time-of-flight (ToF), taken by the optical beam to travel from the source to the target scene and back to a detector adjacent to the source.
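As a concrete illustration of the round-trip relation described above, the distance to a scene point follows directly from the measured time-of-flight. This is a minimal sketch, not from the patent text; the sample delay value is illustrative:

```python
# Round-trip time-of-flight to distance: the pulse travels
# source -> scene -> detector, so the one-way distance is half the
# round-trip time multiplied by the speed of light.
C = 299_792_458.0  # speed of light, m/s

def tof_to_distance_m(round_trip_s: float) -> float:
    """Convert a round-trip time-of-flight (seconds) to distance (metres)."""
    return C * round_trip_s / 2.0

# A round trip of about 66.7 ns corresponds to roughly 10 m.
d = tof_to_distance_m(66.7e-9)
```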
  • a suitable detector for ToF-based LiDAR is provided by a single-photon avalanche diode (SPAD), also known as a Geiger-mode avalanche photodiode.
  • Arrays of SPAD sensors, fabricated on a single chip, have been used experimentally in 3D imaging cameras. Charbon et al. provide a useful review of SPAD technologies in "SPAD-Based Sensors," published in TOF Range-Imaging Cameras (Springer-Verlag, 2013), which is incorporated herein by reference.
  • a p-n junction is reverse-biased at a level well above the breakdown voltage of the junction.
  • the electric field is so high that a single charge carrier injected into the depletion layer, due to an incident photon, can trigger a self-sustaining avalanche.
  • the leading edge of the avalanche current pulse marks the arrival time of the detected photon.
  • the current continues until the avalanche is quenched by lowering the bias voltage down to or below the breakdown voltage.
  • This latter function is performed by a quenching circuit, which may simply comprise a high-resistance ballast load in series with the SPAD, or may alternatively comprise active circuit elements.
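As background on the passive-quenching option mentioned above (standard SPAD behaviour, not specific to this patent): with a high-resistance ballast load, the avalanche pulls the diode bias down toward breakdown, and the pixel then recovers with an RC time constant set by the ballast resistance and the diode capacitance. The component values below are assumed orders of magnitude for illustration:

```python
# Passive-quench recovery time constant: tau = R_ballast * C_diode.
# Both values are illustrative assumptions, not taken from the patent.
R_ballast = 200e3   # ballast resistance, ohms (typical order of magnitude)
C_diode = 0.1e-12   # SPAD junction capacitance, farads (assumed)

tau = R_ballast * C_diode   # recovery time constant, ~20 ns here
```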
  • Embodiments of the present invention that are described hereinbelow provide improved LiDAR sensors and methods of their use.
  • an electro-optical device which includes a laser light source, which is configured to emit at least one beam of light pulses, a beam steering device configured to transmit and scan the at least one beam across a target scene, and an array of sensing elements. Each sensing element is configured to output a signal indicative of a time of incidence of a single photon on the sensing element.
  • Light collection optics are configured to image the target scene scanned by the transmitted beam onto the array.
  • Circuitry is coupled to actuate the sensing elements only in a selected region of the array and to sweep the selected region over the array in synchronization with scanning of the at least one beam.
  • the circuitry is configured to select the region such that at any instant during the scan, the selected region contains a part of the array onto which the light collection optics image an area of the target scene that is illuminated by the at least one beam.
  • the selected region may include one sensing element or multiple sensing elements.
  • the circuitry is configured to process signals output by the sensing elements in order to determine respective distances to points in the target scene.
  • the sensing elements include single-photon detectors, such as single-photon avalanche diodes (SPADs).
  • the laser light source is configured to emit at least two beams along different, respective beam axes, such that at any instant during the scan, the light collection optics image respective areas of the target scene that are illuminated by the at least two beams onto different, respective ones of the sensing elements.
  • the beam steering device is configured to scan the at least two beams across the target scene in a two-dimensional scan
  • the circuitry is configured to sweep the selected region over the array in a two-dimensional pattern corresponding to the two-dimensional scan.
  • the two-dimensional scan may form a raster pattern, wherein the respective beam axes of the at least two beams are mutually offset transversely relative to a scan line direction of the raster pattern.
  • the beam steering device is configured to scan the at least two beams across the target scene in a linear scan in a first direction, and the at least two beams include multiple beams arranged along a column axis in a second direction, perpendicular to the first direction.
  • the multiple beams are arranged in at least two columns, having respective column axes that are orthogonal to the first direction of the scan and are mutually offset.
  • a method for sensing which includes emitting at least one beam of light pulses and transmitting and scanning the at least one beam across a target scene.
  • An array of sensing elements is provided, each sensing element configured to output a signal indicative of a time of incidence of a single photon on the sensing element.
  • the target scene scanned by the transmitted beam is imaged onto the array.
  • the sensing elements are actuated only in a selected region of the array, and the selected region is swept over the array in synchronization with scanning of the at least one beam.
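The selective-actuation step above can be sketched as a simple region computation: given the expected image position of the scanned spot on the array, actuate only the super pixel that covers it. Names, the pitch units, and the 2x2 default are illustrative assumptions:

```python
# Hedged sketch of region selection: return the set of (col, row)
# indices of a size x size super pixel covering the expected spot
# image position; only these elements would be biased on.
def active_region(spot_x: float, spot_y: float, pitch: float, size: int = 2):
    """Indices of the actuated sensing elements, in pitch units."""
    col0 = int(spot_x // pitch)
    row0 = int(spot_y // pitch)
    return {(col0 + i, row0 + j) for i in range(size) for j in range(size)}
```

As the beam steering device scans, the expected spot position advances and this region sweeps across the array in synchronization.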
  • Fig. 1 is a schematic illustration of a LiDAR system, in accordance with an embodiment of the invention
  • Fig. 2 is a block diagram that schematically illustrates a SPAD-based sensing device, in accordance with an embodiment of the invention
  • Fig. 3 is a block diagram showing components of a sensing element in a SPAD array, in accordance with an embodiment of the invention
  • Fig. 4 is a block diagram that schematically illustrates a SPAD array with a scanned region of sensitivity, in accordance with an embodiment of the invention
  • Fig. 5 is a schematic illustration of a detector array with a circular scanned illumination spot, in accordance with an embodiment of the invention
  • Fig. 6 is a schematic illustration of a detector array with a circular scanned illumination spot, in accordance with another embodiment of the invention.
  • Figs. 7A-C are schematic illustrations of a detector array with an elliptical scanned illumination spot, in accordance with yet another embodiment of the invention.
  • Fig. 8 is a schematic illustration of a detector array with two circular illumination spots scanned in a two-dimensional raster scan, in accordance with an embodiment of the invention
  • Fig. 9 is a schematic illustration of a detector array with a staggered array of illumination spots scanned in a one-dimensional scan, in accordance with an embodiment of the invention.
  • Fig. 10 is a schematic illustration of a LiDAR device implementing a one-dimensional scan, in accordance with an embodiment of the invention
  • Fig. 11 is a schematic illustration of a LiDAR device implementing a one-dimensional scan, in accordance with another embodiment of the invention
  • Fig. 12 is a schematic illustration of a LiDAR device using a laser light source with adjustable emissive power, in accordance with an embodiment of the invention.
  • Fig. 13 is a schematic illustration of a LiDAR device using two laser light sources with different emissive powers, in accordance with an embodiment of the invention.
  • the quality of the measurement of the distance to each point in a target scene (target scene depth) using a LiDAR is often compromised in practical implementations by a number of environmental, fundamental, and manufacturing challenges.
  • An example of environmental challenges is the presence of uncorrelated background light, such as solar ambient light, in both indoor and outdoor applications, typically reaching an irradiance of 1000 W/m².
  • Fundamental challenges are related to losses incurred by optical signals upon reflection from the target scene surfaces, especially due to low-reflectivity target scenes and limited optical collection aperture, as well as electronic and photon shot noises.
  • the embodiments of the present invention that are described herein address the above limitations so as to enable compact, low-cost LiDARs achieving accurate high-resolution depth imaging that can operate in uncontrolled environments.
  • the disclosed embodiments use one or more pulsed laser sources emitting beams to generate high-irradiance illumination spots at the intersections of the axes of the emitted beams with the target scene.
  • the beams and hence the illumination spots are scanned across the target scene.
  • the illumination reflected from the target scene is imaged by collection optics onto and detected by a time-of-flight, single-photon detector array for high signal-to-noise ratio, with the distance to each point of the target scene derived from the time-of-flight data.
  • Imaging of the target scene onto the detector array generates a one-to-one correspondence between locations in the target scene and locations on the detector array, defined by geometrical optics, as is known in the art.
  • an area of the target scene is imaged onto a corresponding image area on the detector, with a linear length in the image given by multiplying the corresponding length in the target scene area by the optical magnification M, wherein for LiDAR systems typically M ≪ 1.
  • a sensing element of the detector array can be thought of as imaged back onto the target scene with magnification 1/M, giving the location and area of the target scene that is "seen" by the sensing element.
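The two magnification relations above can be checked numerically. The magnification and pitch values here are illustrative assumptions, not figures from the patent:

```python
# Forward imaging: a scene length maps to image length M * L_scene.
# Back-imaging: a sensing element of pitch p "sees" a scene patch p / M.
M = 0.001          # example optical magnification (assumed, M << 1)
scene_len = 2.0    # a 2 m feature in the target scene
pitch = 25e-6      # 25 um sensing-element pitch (assumed)

image_len = M * scene_len    # -> 2 mm on the detector array
seen_patch = pitch / M       # -> 25 mm patch of the scene per element
```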
  • the detector array comprises a two-dimensional array of single-photon time-sensitive sensing elements, such as single-photon avalanche diodes (SPADs).
  • the sensitivity, including the on/off-state, of each SPAD is controlled by its specific reverse p-n junction high voltage.
  • the SPADs work as individual sensing elements, whereas in other embodiments several SPADs are grouped together into super pixels.
  • only the sensing elements in the area or areas of the array that are to receive reflected illumination from a scanned beam are actuated. The sensing elements are thus actuated only when their signals provide useful information. This approach both reduces the background signal, which would lower the signal-to-background ratio, and lowers the electrical power needs of the detector array.
  • a LiDAR measures the distance to the target scene for a set of discrete points with a finite averaging area associated with each point.
  • the parameters of the measurement, as well as the actuation of sensing elements, are affected by the following system parameters of the LiDAR:
  • the size of the super pixels of the detector array, in other words the number of sensing elements that are binned together in the ToF measurement (including the case in which one sensing element is used as a super pixel).
  • the target scene is illuminated and scanned by either one laser beam or by multiple beams.
  • these beams are generated by splitting a laser beam using diffractive optical elements, prisms, beamsplitters, or other optical elements that are known in the art.
  • multiple beams are generated using several discrete laser light sources.
  • the multiple beams are generated using a monolithic laser array, such as an array of VCSELs or VECSELs.
  • a beam steering device such as a scanning mirror, is operated to scan the target scene with a single light beam in a two-dimensional raster scan.
  • a raster scan generally comprises long, approximately straight back-and-forth scans, so-called scan lines, along with short movements transferring the scan point from one scan line to the next.
  • a raster pattern is described here by way of example, and alternative scan patterns implementing similar principles are considered to be within the scope of the present invention.
  • the scan resolution in the direction perpendicular to the scan lines of the raster scan is given by the separation between successive scan lines.
  • the scan resolution can be increased by decreasing the separation between successive scan lines, but this sort of resolution increase will come at the expense of reduced frame rate, since a larger number of scan lines is required to cover the scene.
  • the resolution may be increased at the expense of reduced field of view if the number of scan lines per frame is unchanged. Mechanical constraints put a limit on the degree to which the scanning speed of the mirror can be increased in order to offset these effects.
  • the scan resolution in the direction perpendicular to the scan lines is increased by using multiple light beams, spread transversely relative to the scan line direction as well as in the scan line direction.
  • the separation of the light beams along the scan line is configured so that each light beam illuminates a separate super pixel on the detector array, in order to identify individually each light beam.
  • the transverse separation of the light beams, rather than the scan line density, now determines the scan resolution.
  • the disclosed embodiment achieves an increase in the lateral resolution without reducing the size of the sensing elements, thus mitigating the miniaturization requirements for the detector array.
  • multiple illumination spots are scanned across the target scene in a linear scan.
  • a linear scan in this context includes scans along a single direction in which the scan line is distorted from a straight line due to optical or mechanical imperfections.
  • Employing a one-dimensional, linear scan permits the use of a simpler and cheaper beam steering device than for a two-dimensional scan, but the number of light beams to cover the target scene with sufficiently high resolution is generally higher than that required for a two-dimensional scan.
  • a single-column scan can be implemented with multiple light beams configured in a column perpendicular to the scan line, generating one column of illumination spots. The highest scan resolution in the direction of the axis of the column is attained when each illumination spot is imaged onto a separate sensing element in the detector array.
  • the scan resolution perpendicular to the scan line is increased by generating multiple columns of illumination spots, perpendicular to the scan line and mutually offset in the direction of the axes of the columns.
  • the multiple columns are also mutually offset in the direction of the scan line by at least one sensing element, so as to have each illumination spot illuminate a separate sensing element, and thus permit each illumination spot to be separately identified.
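The staggered-column layout described above can be sketched as follows. This is a hypothetical illustration: `n_per_col`, `n_cols`, and the spacings `dx`, `dy` are assumed parameters, with positions expressed in sensing-element pitches:

```python
# Generate spot positions for multiple columns of beams: each column
# is offset along the scan line (x) by at least one element so spots
# land on separate sensing elements, and offset transversely (y) by a
# sub-pitch amount so the columns interleave and refine the resolution.
def staggered_spots(n_per_col: int, n_cols: int, dy: float, dx: float):
    spots = []
    for c in range(n_cols):
        for r in range(n_per_col):
            x = c * dx                     # along-scan offset per column
            y = r * dy + c * dy / n_cols   # transverse sub-pitch stagger
            spots.append((x, y))
    return spots
```

With two columns, the effective transverse resolution becomes `dy / 2` rather than `dy`, without shrinking the sensing elements.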
  • Some embodiments of the present invention provide LiDAR systems with a wide angular field-of-view (FoV), covering a large depth range.
  • these embodiments apply dedicated designs and use-modes of laser light sources, detector arrays, electronics, and algorithms to measure scene depths over a wide range of FoVs and distances, while keeping the optical design and construction simple.
  • the considerations for the laser light source relate to its emissive power: If one were to use only low emission-power laser light sources for target scene scanning, the signal received by the detector array from distant points of the target scene would be too weak for a robust and accurate measurement. If, on the other hand, one were to use only high emission-power laser light sources, capable of measuring distant target scene points, unnecessarily high emissive power would be used by the LiDAR for nearby target scene points, increasing the electrical power consumption of the LiDAR. Therefore, in some embodiments of the invention, the laser light source emissive power is adjusted according to the measured distance.
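The distance-adaptive power idea above can be sketched with a simple controller. Since the collected signal falls off roughly with the square of the distance, pulse energy can be scaled accordingly and clamped to hardware limits. All constants and the quadratic law are illustrative assumptions, not taken from the patent:

```python
# Hedged sketch: scale emissive pulse energy with the square of the
# previously measured distance, clamped between hardware limits.
def pulse_energy(distance_m: float, e_ref: float = 1.0, d_ref: float = 1.0,
                 e_min: float = 0.1, e_max: float = 10.0) -> float:
    """Pulse energy (arbitrary units) for the next measurement."""
    e = e_ref * (distance_m / d_ref) ** 2
    return max(e_min, min(e_max, e))
```

Nearby points thus receive low-energy pulses (saving electrical power), while distant points receive enough energy for a robust measurement.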
  • Fig. 1 shows schematically a LiDAR system 18, in accordance with an embodiment of the invention.
  • the beam or beams from a laser light source 20, comprising one or more pulsed lasers, are directed to a target scene 22 by a dual-axis beam-steering device 24, forming and scanning illumination spots 26 over the target scene.
  • the term "light” is used herein to refer to any sort of optical radiation, including radiation in the visible, infrared, and ultraviolet ranges.
  • Beam-steering devices can comprise, for example, a scanning mirror, or any other suitable type of optical deflector or scanner that is known in the art.
  • Illumination spots 26 are imaged by collection optics 27 onto a two-dimensional detector array 28, comprising single-photon, time-sensitive sensing elements, such as SPADs.
  • Target scene 22 is also illuminated, besides illumination spots 26, by an ambient light source 36, such as the sun.
  • the irradiance of the illumination spots is chosen to be much higher than that of the ambient illumination, which can reach up to 1000 W/m² due to irradiance from the sun, for example.
  • a band-pass filter 37 is used for further reduction of ambient illumination on detector array 28.
  • a control circuit 38 is connected to laser light source 20, timing the pulse emissions and controlling their emissive power, and to dual-axis beam-steering device 24, controlling the scan of illumination spots 26.
  • control circuit 38 adjusts dynamically the reverse p-n junction high voltage of each SPAD of detector array 28, thus controlling the actuation and sensitivity of each SPAD.
  • control circuit 38 actuates only those SPADs onto which, at any given moment, the illumination spots are imaged by collection optics 27. Utilizing further the above knowledge of laser light source 20 and beam steering device 24, as well as the signals read from detector array 28, control circuit 38 determines the distance to each scanned point in target scene 22 using the time-of-flight measured from the laser light source to the detector array.
  • Figs. 2-4 illustrate schematically the architecture and functioning of detector array 28, in accordance with embodiments of the invention. These figures show one possible scheme that can be used for selectively actuating the SPAD-based sensing elements in the array, using a combination of global and local bias controls. Alternatively, other sorts of biasing and actuation schemes, as well as other sorts of single-photon sensing elements, may be used for these purposes.
  • Fig. 2 is a block diagram that schematically illustrates detector array 28, in accordance with an embodiment of the invention.
  • Detector array 28 comprises sensing elements 44, each comprising a SPAD and associated biasing and processing circuits, as described further hereinbelow.
  • a global high-voltage bias generator 46 applies a global bias voltage to all of sensing elements 44 in array 28.
  • a local biasing circuit 48 in each sensing element 44 applies an excess bias, which sums with the global bias in the sensing element.
  • a sensing element bias control circuit 50 sets the excess bias voltages applied by local biasing circuits 48 to different, respective values in different sensing elements.
  • Both global high-voltage bias generator 46 and sensing element bias control circuit 50 are connected to control circuit 38 (Fig. 1).
  • Fig. 3 is a block diagram showing components of one of sensing elements 44 in array 28, in accordance with an embodiment of the invention.
  • array 28 comprises a two-dimensional matrix of the sensing elements formed on a first semiconductor chip 52, with a second two-dimensional matrix of bias control and processing circuits formed on a second semiconductor chip 54. (Only a single element of each of the two matrices is shown.) Chips 52 and 54 are coupled together so that the two matrices are in a one-to-one correspondence, whereby each sensing element on chip 52 is in contact with the corresponding bias control and processing elements on chip 54.
  • Both of chips 52 and 54 may be produced from silicon wafers using well-known CMOS fabrication processes, based on SPAD sensor designs that are known in the art, along with accompanying bias control and processing circuits as described herein.
  • the designs and principles of detection that are described herein may be implemented, mutatis mutandis, using other materials and processes.
  • all of the components shown in Fig. 3 may be formed on a single chip, or the distribution of the components between the chips may be different. All such alternative implementations are considered to be within the scope of the present invention.
  • Sensing element 44 comprises a SPAD 56, comprising a photosensitive p-n junction, as is known in the art.
  • Peripheral circuits including a quenching circuit 58 and local biasing circuit 48, are typically located on chip 54.
  • the actual bias applied to SPAD 56 is a sum of the global bias voltage Vbias provided by bias generator 46 (Fig. 2) and an excess bias applied by biasing circuit 48.
  • Sensing element bias control circuit 50 (Fig. 2) sets the excess bias to be applied in each sensing element by setting a corresponding digital value in a bias memory 60 on chip 54.
  • In response to each captured photon, SPAD 56 outputs an avalanche pulse, which is received by processing circuits on chip 54, including digital logic 62 and a memory configured as an output buffer 64.
  • processing elements can be configured, for example, to function as a time-to-digital converter (TDC), which measures the delay of each pulse output by SPAD 56 relative to a reference time and outputs a digital data value corresponding to the delay.
  • logic 62 and buffer 64 may measure and output other sorts of values, including (but not limited to) a histogram of pulse delay times, a binary waveform, or a multi-level digital waveform.
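The histogram output mode described above can be sketched in software terms: a TDC quantises each avalanche pulse's delay relative to the laser trigger, and a per-pixel histogram of those delays is accumulated over many pulses. The bin width and range are assumed values for illustration:

```python
# Hedged sketch of per-pixel TDC histogramming: quantise each pulse
# delay into fixed-width time bins and count occurrences per bin.
def accumulate_histogram(delays_s, bin_s=1e-9, n_bins=100):
    """Histogram of pulse delays; 1 ns bins and 100 bins are assumed."""
    hist = [0] * n_bins
    for d in delays_s:
        b = int(d / bin_s)
        if 0 <= b < n_bins:   # out-of-range delays are discarded
            hist[b] += 1
    return hist
```

The bin holding the histogram peak then gives the round-trip time estimate, from which the distance is derived.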
  • the outputs from chip 54 are connected to control circuit 38 (Fig. 1).
  • Fig. 4 is a block diagram that schematically illustrates SPAD array 28 with a scanned region 70 of sensitivity, in accordance with an embodiment of the invention.
  • bias control circuit 50 sets the bias voltages of sensing elements 72 within region 70 to higher values than the remaining sensing elements 76, wherein the bias voltage is set so that sensing elements 76 are turned off.
  • Bias control circuit 50 modifies the bias voltages of the sensing elements dynamically, however, so as to sweep region 70 across the array, as illustrated by the arrow in the figure.
  • Circuit 50 may, for example, sweep region 70 in a raster scan, in synchronization with the scanning of a laser beam across a target scene being imaged onto array 28 (as is illustrated in the figures that follow).
  • this embodiment is useful, inter alia, in tailoring the sensitive region of array 28 to the shape of an illuminating light beam or of an area of interest in a target scene being imaged, thus maximizing sensitivity of array 28 relative to power consumption, while reducing background noise from sensing elements that will not contribute to the signal.
  • bias control circuit 50 sets the local bias voltages so that region 70 has a linear shape, extending along one or more columns of array 28 and matching the linear shape of an illumination beam or array of beams. Circuit 50 may then sweep this linear region 70 across array 28 in synchronization with the illumination beam.
  • other scan patterns may be implemented, including both regular and adaptive scan patterns.
  • Fig. 5 is a schematic illustration showing detector array 28 with the image of a circular scanned illumination spot 26 (Fig. 1) superimposed on the array, in accordance with an embodiment of the invention.
  • the images of scanned illumination spot 26 for these three consecutive points in time are denoted by circles 84, 86, and 88, respectively, with their diameters, in this example, twice the pitch of sensing elements 44.
  • An arrow 90 indicates the direction of the scan of the image of scanned illumination spot 26, with the expected position of the image of the scanned illumination spot determined from the knowledge of the state of beam steering device 24.
  • sensing elements 44 in a region of array 28 that best matches the position of the image of illumination spot 26 at that point of time are actuated.
  • These actuated sensing elements can be regarded as a sort of "super pixel.”
  • each super pixel comprises an array of 2x2 sensing elements, but in some embodiments the size of the super pixel may take other values, set either statically or dynamically.
  • each sensing element 44 is associated with two neighboring super pixels. Only those sensing elements within the active super pixel are actuated at a given moment, with the rest of the sensing elements turned off by lowering their bias voltage to a level at which avalanche multiplication is not sustainable.
  • This operation maximizes the collection of the optical signal from the image of scanned illumination spot 26, while reducing the exposure to target scene background illumination that is uncorrelated with the illumination spot, thus increasing the signal-to-background ratio of array 28.
  • the outputs of the sensing elements that are not illuminated by the image of scanning spot 26 are masked out using standard logic gates.
  • the lateral resolution of target scene 22 in the direction of the scan is determined by the discrete step size of the scan (as determined by the scan speed and laser pulse repetition rate), which in this embodiment is one pitch of sensing elements 44.
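The scan-step relation stated above reduces to a single division: the discrete step between successive measurements equals the scan speed divided by the laser pulse repetition rate. The numeric values are illustrative assumptions chosen to reproduce this embodiment's one-pitch step:

```python
# Lateral scan step = scan speed / pulse repetition rate.
# Units here are sensing-element pitches per second and pulses per
# second; both values are assumed for illustration.
scan_speed = 1000.0   # image of the spot moves 1000 pitches per second
rep_rate = 1000.0     # 1000 laser pulses per second

step = scan_speed / rep_rate   # one pitch per pulse, as in this embodiment
```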
  • the area over which the target scene distance is averaged is (approximately) the area of a super pixel.
  • Fig. 6 is a schematic illustration showing detector array 28 with the image of a circular scanned illumination spot 26 (Fig. 1) superimposed on the array, in accordance with another embodiment of the invention.
  • Both the diameter of the image of the scanned illumination spot and the scanning step between two consecutive points in time are half of the pitch of sensing elements 44.
  • the images of scanned illumination spot 26 for the three consecutive points in time are denoted by circles 100, 102, and 104, respectively.
  • An arrow 105 indicates the direction of the scan, with the expected position of the image determined from the knowledge of the state of beam steering device 24.
  • the lateral resolution of the target scene 22 image in the direction of the scan is half of the pitch of sensing elements 44, and the area of target scene over which the distance is averaged is the area of illumination spot 26.
  • Figs. 7A-C are schematic illustrations showing detector array 28 with the image of an elliptical scanned illumination spot 26 (Fig. 1) superimposed on the array, in accordance with yet another embodiment of the invention.
  • An elliptical illumination spot is obtained, for instance, from an edge-emitting laser diode in which the emitting junction cross-section is a rectangle with a high aspect ratio.
  • An elliptical illumination spot 26 with an aspect ratio of 3-to-1 is illustrated, although other aspect ratios can be used in other embodiments.
  • Each scan step on detector array 28 is one pitch of sensing elements 44. In this embodiment, super pixels of 2x2 sensing elements are used.
  • An arrow 118 indicates the direction of the scan, with the expected position of illumination spot 110 determined from the knowledge of the state of beam steering device 24.
  • The super pixels actuated at this time are 112, 114, 116, and 122. Now four super pixels are actuated, since a significant portion of illumination spot 120 (top of ellipse) is still within pixel 112, and another significant portion (bottom of ellipse) has entered pixel 122.
  • Super pixels 112, 114, and 116 continue collecting the signal so as to improve the signal-to-noise ratio.
  • The super pixels actuated at this time, based on the expected position of illumination spot 124, are now 114, 116, and 122. Now only three super pixels are actuated, as pixel 112 (Fig. 7B) is no longer illuminated by any significant portion of illumination spot 124.
  • Each super pixel will be exposed to the image of illumination spot 26 for seven scan steps, thus improving the signal-to-noise ratio.
  • The resolution in the direction of the scan is determined by the super pixel size.
  • Since the super pixel size is a third of the length of the elliptical illumination spot along its fast (long) axis, the resolution obtained in the direction of the scan line is three times as good (a third in numerical value) as that obtained with the elliptical illumination spot alone.
  • The averaging area for the distance measurement is the area of a super pixel.
  • Control circuit 38 calculates (or looks up) the actual shape of the illumination spot image on the detector array, and the results of this calculation are used in choosing the sensor elements to be activated at each point in the scan.
  • The calculation takes into account the design of beam steering device 24, its scanning movement characteristics, the exact state of the beam steering device, and the angle between the beam from laser light source 20 and the beam steering device, as they affect the shape, direction of movement, and orientation of the image of illumination spot 26.
  • The dependence of the image on the distance between the LiDAR device and target scene 22 is also taken into account.
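The spot-overlap logic of Figs. 7A-C, in which a super pixel is enabled only when a significant portion of the spot image falls within it, can be sketched in a simplified one-dimensional form. The 25% overlap threshold, the grid geometry, and the function names are assumptions for illustration.

```python
# Simplified 1-D sketch of spot-overlap gating along the fast axis of an
# elliptical spot: a super pixel row is enabled when the spot's extent
# covers more than an (assumed) threshold fraction of that row.

def spot_covers(cell_y0, cell_y1, spot_center_y, semi_axis):
    """True if the spot's fast-axis extent overlaps the cell by more
    than a quarter of the cell height (assumed threshold)."""
    lo = max(cell_y0, spot_center_y - semi_axis)
    hi = min(cell_y1, spot_center_y + semi_axis)
    return (hi - lo) / (cell_y1 - cell_y0) > 0.25

def active_super_pixels(spot_center_y, semi_axis, super_pixel_h, n_rows):
    return [r for r in range(n_rows)
            if spot_covers(r * super_pixel_h, (r + 1) * super_pixel_h,
                           spot_center_y, semi_axis)]

# A spot three super pixels long (semi-axis of 1.5 row heights) centered
# on a row boundary activates four rows; shifted slightly, only three:
four_rows = active_super_pixels(3.0, 1.5, 1.0, 8)    # [1, 2, 3, 4]
three_rows = active_super_pixels(3.25, 1.5, 1.0, 8)  # [2, 3, 4]
```

This mirrors the transition between Figs. 7B and 7C, where the count of actuated super pixels changes from four to three as the spot leaves pixel 112.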
  • Fig. 8 is a schematic illustration showing a technique for enhancement of the resolution of a raster-scanning LiDAR, in accordance with an embodiment of the invention.
  • Beam steering device 24 scans the images of illumination spots 26 (Fig. 1) on detector array 28 in a raster scan pattern 130 down one column and up the next column of the detector array. If only one illumination spot were to be used, the lateral resolution perpendicular to the scan lines of the raster scan would be the pitch of sensing elements 44. In the present embodiment, however, the lateral resolution is doubled by using two scanned illumination spots 26, whose images on detector array 28 are separated along the scan line by a distance equal to the pitch of sensing elements 44, and transversely to the scan line by half of this pitch. Beam steering device 24 and the repetition rate of laser light source 20 are configured so that successive illumination spots are separated by steps of half the pitch of sensing elements 44 in the direction of the scan line of the raster scan. Each super pixel comprises one sensing element 44.
  • The images of the illumination spots are a spot 132 and a spot 134, with spot 132 inside a super pixel 136, and spot 134 inside a super pixel 138. All other super pixels are turned off.
  • At the next scan step, both spots have moved down, as indicated by arrows 140, by half a super pixel, to new positions 142 and 144.
  • The averaging area of distance measured by each of illumination spots 26 is the area of that illumination spot.
  • The number of scanned illumination spots 26 can be increased to more than two (as compared to Fig. 8), with the illumination spots separated along raster scan pattern 130 so that the image of each illumination spot is located in a different sensing element 44.
  • With N illumination spots, the resolution transversely to raster scan 130 is given by dividing the pitch of sensing elements 44 by N.
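The multi-spot raster of Fig. 8 generalized to N spots can be sketched as below. Coordinates are in units of the sensing-element pitch, and the exact layout (one pitch along the scan, pitch/N transversely) is an illustrative reading of the two-spot case.

```python
# Sketch: offsets of N scanned spots, each landing in a different sensing
# element, each sampling a distinct transverse line spaced pitch/N apart.

def spot_offsets(n_spots, pitch=1.0):
    """(along-scan, transverse) offsets for N spots; the transverse
    spacing of pitch/N sets the transverse resolution."""
    return [(k * pitch, k * pitch / n_spots) for k in range(n_spots)]

offsets = spot_offsets(2)
# [(0.0, 0.0), (1.0, 0.5)]: the two spots of Fig. 8, one pitch apart along
# the scan line and half a pitch apart transversely, doubling resolution
```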
  • Figs. 9-11 are schematic illustrations showing a LiDAR based on a linear scan, in accordance with an embodiment of the invention.
  • A linear (one-dimensional) scan has the advantage that it utilizes a potentially smaller, cheaper, and more reliable design of the beam steering device than that required for a two-dimensional scan.
  • The resolution in the direction of the linear scan is determined by the resolution of the beam steering device. As no scan takes place transversely to the direction of the linear scan, resolution in that direction is achieved by using multiple illumination spots 26 arrayed across target scene 22.
  • Fig. 9 is a schematic illustration showing a one-dimensional scan as imaged onto detector array 28, in accordance with an embodiment of the invention.
  • The resolution of the LiDAR in the direction perpendicular to the linear scan is improved beyond the pitch of sensing elements 44 by using a pattern 150 of images of illumination spots 26 comprising two staggered columns 151 and 152, with circles 153 denoting the expected positions of the images of the individual illumination spots on sensor array 28.
  • Arrows 154 indicate the direction of the scan.
  • In each column 151 and 152 of pattern 150, the spacing of the images of illumination spots 26, as indicated by circles 153, along the axis of the respective column is equal to the pitch of sensing elements 44.
  • The two columns 151 and 152 are mutually offset by half of the pitch of sensing elements 44 in the direction of the axes of the columns.
  • Columns 151 and 152 are spaced in the direction of the scan by one pitch in order to assign the two columns to separate sensing elements.
  • The resolution transverse to the linear scan can be further improved by using more than two columns of illumination spots 26 with smaller mutual offsets in the direction of the axes of the columns.
  • With four columns, for example, a resolution of one quarter pitch is achieved.
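The staggered-column pattern of Fig. 9, generalized to M columns, can be sketched as follows. Units are sensing-element pitches, and the layout parameters are an illustrative reading of the two-column case described above.

```python
# Sketch: M columns of spots, spaced one pitch apart in the scan
# direction, each column offset by pitch/M along the column axis, giving
# a transverse resolution of pitch/M.

def staggered_pattern(n_columns, spots_per_column, pitch=1.0):
    spots = []
    for c in range(n_columns):
        for s in range(spots_per_column):
            x = c * pitch                          # scan direction
            y = s * pitch + c * pitch / n_columns  # column axis, staggered
            spots.append((x, y))
    return spots

pattern = staggered_pattern(2, 3)
# column 0 at y = 0, 1, 2; column 1 shifted by half a pitch: 0.5, 1.5, 2.5
```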
  • Fig. 10 is a schematic illustration showing a LiDAR 159 based on a one-dimensional scan, in accordance with an embodiment of the invention.
  • The beam from a single pulsed laser source 160 is split by a diffractive optical element (DOE) 162 into two staggered columns of multiple beams. These beams are directed to and scanned over target scene 22 by a single-axis beam-steering device 166, forming two staggered columns of illumination spots 168 on target scene 22.
  • The illumination spots are imaged by collection optics 27 onto detector array 28, forming two staggered columns 151 and 152 in pattern 150 as shown in Fig. 9.
  • Control circuit 38 is connected to laser light source 160, beam steering device 166, and detector array 28, controlling their functions and collecting data to determine the distance to target scene 22 by using time-of-flight data.
  • Fig. 11 is a schematic illustration showing a LiDAR 170 based on a one-dimensional scan and a co-axial optical architecture, in accordance with another embodiment of the invention.
  • The beam from a single pulsed laser source 160 is split by DOE 162 into two staggered columns of multiple beams. These beams pass through a polarizing beamsplitter 176, and are directed to and scanned over target scene 22 by single-axis beam-steering device 166, thus forming two staggered columns of illumination spots 168.
  • The illumination spots, reflected from target scene 22, are imaged through beam steering device 166, polarizing beamsplitter 176, and collection optics 27 onto detector array 28, forming two staggered columns 151 and 152 in pattern 150 as shown in Fig. 9.
  • Control circuit 38 is connected to laser light source 160, beam steering device 166, and detector array 28, controlling their functions and collecting data to determine the distance to target scene 22 using time-of-flight data.
  • The lateral resolution perpendicular to the scan direction is half of the pitch of sensing elements 44, and the resolution along the scan is determined by the scan rate of beam steering device 166 and the pulse repetition rate of laser source 160.
  • Each one of illumination spots 168 averages the distance measurement over the area of that spot.
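All of these embodiments ultimately convert a measured round-trip time of flight into a distance. A minimal sketch of that standard conversion (this is generic physics, not a detail specific to the patent):

```python
# Time-of-flight ranging: the light travels to the target and back, so
# the range is half the round-trip path length.
C_M_PER_S = 299_792_458.0  # speed of light in vacuum

def tof_to_distance_m(round_trip_time_s):
    """Range = (speed of light x round-trip time) / 2."""
    return C_M_PER_S * round_trip_time_s / 2.0

d = tof_to_distance_m(200e-9)  # a 200 ns round trip -> roughly 30 m
```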
  • Figs. 12-13 are schematic illustrations showing LiDARs that adapt themselves to near and far distances of the target scene, in accordance with embodiments of the invention.
  • Fig. 12 is a schematic illustration showing a LiDAR 199, which adapts itself for measuring distances to both near and far target scene points, in accordance with an embodiment of the invention.
  • The beam of a pulsed laser light source 200 is directed to target scene 22 by dual-axis beam-steering device 24, forming an illumination spot 206 on and scanning the spot over the target scene.
  • Illumination spot 206 is imaged onto detector array 28 by collection optics 27.
  • Control circuit 38 is connected to laser light source 200, beam steering device 24, and detector array 28.
  • Laser light source 200 has the capability to emit light at two power levels: low emissive power and high emissive power, under control of signals from control circuit 38.
  • Sensing elements 44 of detector array 28 (see Fig. 2) have the capability to operate in two distinct modes: short-range mode and long-range mode.
  • For a given mode of operation, control circuit 38 adjusts the timing and sensitivity of the sensing elements, as well as the signal processing algorithms, for optimal performance in that mode.
  • In short-range mode, sensing elements 44 are biased for relatively lower sensitivity (which also results in lower noise) and gated to sense short times of flight.
  • In long-range mode, sensing elements 44 are biased for relatively higher sensitivity and gated to sense longer times of flight, thus reducing the likelihood of spurious detection of short-range reflections.
  • The area is first scanned using laser light source 200 at its low emissive power level, suitable for short-range detection.
  • The sensing elements 44 in detector array 28 receiving the light originating from laser light source 200 are actuated with their timing, sensitivity, and associated signal processing algorithms set for short-range distance measurement.
  • Control circuit 38 then controls LiDAR 199 to perform a long-range scan only in the areas in which, based on predetermined criteria, the short-range, low-power scan did not yield a sufficiently robust distance measurement.
  • The measurement for these areas is repeated using the high emissive power level of light source 200, with appropriate changes in the timing, sensitivity, and algorithms of sensing elements 44 that are actuated to receive the reflected light from these areas.
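The adaptive two-pass scan of Fig. 12 can be sketched as a control loop. The robustness criterion (a confidence threshold), the mode names, and the `measure` interface are hypothetical placeholders, since the patent leaves the "predetermined criteria" unspecified.

```python
# Sketch of the two-pass adaptive scan: pass 1 at low power in short-range
# mode; pass 2 at high power in long-range mode, only for points whose
# first measurement was not robust. Threshold and modes are assumptions.

def adaptive_scan(points, measure):
    """`measure(point, mode)` returns (distance, confidence)."""
    ranges = {}
    retry = []
    for p in points:                     # pass 1: low power, short range
        d, conf = measure(p, "short_range_low_power")
        if conf >= 0.5:                  # "sufficiently robust" (assumed)
            ranges[p] = d
        else:
            retry.append(p)
    for p in retry:                      # pass 2: high power, long range
        d, _ = measure(p, "long_range_high_power")
        ranges[p] = d
    return ranges

def fake_measure(point, mode):
    # Hypothetical stand-in: point 2 is too far for the low-power pass.
    if point == 2 and mode == "short_range_low_power":
        return (0.0, 0.1)
    return ((80.0 if point == 2 else 10.0 * (point + 1)), 0.9)

result = adaptive_scan([0, 1, 2], fake_measure)
# points 0 and 1 are resolved in the first pass; point 2 in the second
```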
  • Fig. 13 is a schematic illustration showing a LiDAR 210, which adapts itself for measuring distances to both near and far target scene points, in accordance with another embodiment of the invention.
  • The beams of two pulsed laser light sources 218 and 220 are directed to target scene 22 by dual-axis beam-steering device 24, forming an illumination spot 226 on and scanning it over target scene 22.
  • (The separation between laser light sources 218 and 220 is exaggerated in Fig. 13 in order to show the two separate sources.)
  • Illumination spot 226 is imaged onto detector array 28 by collection optics 27.
  • Control circuit 38 is connected to laser light sources 218 and 220, beam steering device 24, and detector array 28.
  • Each laser light source 218, 220, when actuated, emits at a specific emissive power level: laser light source 218 at a low emissive power level, and laser light source 220 at a high emissive power level.
  • Control circuit 38 chooses which of the laser light sources to actuate at each point in the scan based on the sorts of criteria explained above with reference to Fig. 12.
  • Sensing elements 44 of detector array 28 (see Fig. 2) have the capability to operate in two distinct modes: short-range mode and long-range mode. For a given mode of operation of a specific sensing element 44, control circuit 38 adjusts its timing and sensitivity as well as its signal processing algorithms for optimal performance in that mode.
  • The area is first scanned using low-emissive-power laser light source 218.
  • Those sensing elements 44 in detector array 28 that receive the light originating from laser light source 218 are actuated, with their timing, sensitivity, and associated signal processing algorithms set for short-range distance measurement.
  • If control circuit 38 determines that a sufficiently robust distance measurement cannot be made for a given area using laser light source 218, the measurement for that area is repeated using higher-emissive-power laser light source 220, with appropriate changes in the timing, sensitivity, and algorithms of those sensing elements 44 that are actuated for receiving the light from laser light source 220.
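The difference between Fig. 13 and Fig. 12 is that the control circuit selects one of two dedicated lasers per scan point rather than switching one laser between power levels. A minimal sketch of that per-point selection, where the robustness map, threshold, and function name are illustrative assumptions:

```python
# Sketch of per-point source selection: actuate the high-power laser only
# where the low-power measurement was previously found not to be robust.
# Reference numerals 218/220 are reused here as plain identifiers.

LOW_POWER_SOURCE = 218
HIGH_POWER_SOURCE = 220

def choose_source(point, robustness_map, threshold=0.5):
    """Pick the laser to actuate for this scan point, given per-point
    robustness scores from the earlier low-power pass (assumed scheme)."""
    if robustness_map.get(point, 0.0) < threshold:
        return HIGH_POWER_SOURCE
    return LOW_POWER_SOURCE

src = choose_source("p7", {"p7": 0.2})
# the low-power result at p7 was weak, so source 220 is selected
```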

EP16813340.3A 2015-12-20 2016-12-08 Light detection and ranging sensor Ceased EP3391076A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US14/975,790 US9997551B2 (en) 2015-12-20 2015-12-20 Spad array with pixel-level bias control
US201662353588P 2016-06-23 2016-06-23
PCT/US2016/065472 WO2017112416A1 (en) 2015-12-20 2016-12-08 Light detection and ranging sensor

Publications (1)

Publication Number Publication Date
EP3391076A1 true EP3391076A1 (en) 2018-10-24

Family

ID=57570664

Family Applications (1)

Application Number Title Priority Date Filing Date
EP16813340.3A Ceased EP3391076A1 (en) 2015-12-20 2016-12-08 Light detection and ranging sensor

Country Status (4)

Country Link
EP (1) EP3391076A1 (ja)
JP (2) JP6644892B2 (ja)
CN (2) CN111239708B (ja)
WO (1) WO2017112416A1 (ja)

Also Published As

Publication number Publication date
WO2017112416A1 (en) 2017-06-29
JP2020073901A (ja) 2020-05-14
JP6899005B2 (ja) 2021-07-07
JP2018537680A (ja) 2018-12-20
CN108431626B (zh) 2022-06-17
CN108431626A (zh) 2018-08-21
JP6644892B2 (ja) 2020-02-12
CN111239708A (zh) 2020-06-05
CN111239708B (zh) 2024-01-09


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20180711

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20210510

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20231219