CN112955776A - Adjustable pulse characteristics for ground detection in a lidar system - Google Patents

Adjustable pulse characteristics for ground detection in a lidar system

Info

Publication number
CN112955776A
Authority
CN
China
Prior art keywords
lidar system
light
view
scan
pulse
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201980057112.1A
Other languages
Chinese (zh)
Inventor
J. M. Eichenholz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Luminar, LLC
Luminar Technologies, Inc.
Original Assignee
Luminar, LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Luminar, LLC
Publication of CN112955776A



Classifications

    All classifications fall under section G (PHYSICS), in subclass G01S (radio direction-finding; radio navigation; determining distance or velocity by use of radio waves; locating or presence-detecting by use of the reflection or reradiation of radio waves; analogous arrangements using other waves) or subclass G02B (optical elements, systems or apparatus):

    • G01S 17/26: Systems determining position data of a target, for measuring distance only, using transmission of interrupted, pulse-modulated waves, wherein the transmitted pulses use a frequency-modulated or phase-modulated carrier wave, e.g. for pulse compression of received signals
    • G01S 17/10: Systems determining position data of a target, for measuring distance only, using transmission of interrupted, pulse-modulated waves
    • G01S 17/89: Lidar systems specially adapted for mapping or imaging
    • G01S 17/931: Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S 7/4808: Evaluating distance, position or velocity data
    • G01S 7/4817: Constructional features, e.g. arrangements of optical elements, relating to scanning
    • G01S 17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S 7/4812: Constructional features common to transmitter and receiver; transmitted and received beams following a coaxial path
    • G01S 7/4818: Constructional features using optical fibres
    • G01S 7/484: Details of pulse systems; transmitters
    • G01S 7/4865: Receivers; time delay measurement, e.g. time-of-flight measurement, time-of-arrival measurement, or determining the exact position of a peak
    • G01S 7/4868: Receivers; controlling received signal intensity or exposure of sensor
    • G02B 26/105: Scanning systems with one or more pivoting mirrors or galvano-mirrors
    • G02B 26/12: Scanning systems using multifaceted mirrors

Abstract

A method in a lidar system for scanning a field of regard of the lidar system is provided. The method includes identifying, within the field of regard, a ground portion that overlaps an area of the ground in front of the lidar system; causing a light source to emit pulses of light; scanning at least a portion of the emitted pulses of light along a scan pattern contained within the field of regard, including adjusting one or more scan parameters so that at least one of a resolution or a pulse energy for the ground portion of the field of regard is modified relative to another portion of the field of regard; and detecting at least a portion of the scanned pulses of light scattered by one or more remote targets.

Description

Adjustable pulse characteristics for ground detection in a lidar system
Cross Reference to Related Applications
This application claims priority to U.S. Patent Application No. 16/040,263, filed July 19, 2018, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates generally to lidar systems and, more particularly, to lidar systems that vary scan parameters when scanning the portion of the field of regard that overlaps the ground.
Background
The background description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in this background section, as well as aspects of the description that may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure.
Light detection and ranging (lidar) is a technique that can be used to measure distance to a remote target. In general, a lidar system includes a light source and a light receiver. The light source may be, for example, a laser that emits light having a particular operating wavelength. The operating wavelength of the lidar system may be in the infrared, visible, or ultraviolet portions of the electromagnetic spectrum, for example. The light source emits light toward the target, which then scatters the light. Some of the scattered light is received back at the receiver. The system determines a distance to the target based on one or more characteristics associated with the returned light. For example, the system may determine the distance to the target based on the time of flight of the returned light pulse.
A typical lidar system is configured to wait for scattered light to return for some fixed period of time t_max, where t_max corresponds to the time it takes for a light pulse to travel to the maximum distance at which the lidar system is configured to detect a target, and return. For example, a lidar system may be configured to detect targets up to 200 meters away, and thus t_max may be approximately 1.33 microseconds. If the lidar system does not detect scattered light within t_max, the lidar system concludes that no object scattered the outbound pulse and generates the next light pulse.
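By way of illustration only (this sketch is not part of the patent text, and the function names are invented), the round-trip relations D = c·T/2 and t_max = 2·R_MAX/c behind these numbers can be checked as follows:

```python
# Illustrative sketch of pulsed-lidar round-trip timing arithmetic.
# All names here are hypothetical, not from the patent.

C = 2.9979e8  # speed of light in vacuum, m/s


def distance_from_time_of_flight(t_seconds: float) -> float:
    """Distance to a target given the round-trip time of flight: D = c*T/2."""
    return C * t_seconds / 2.0


def max_listen_time(max_range_m: float) -> float:
    """t_max: round-trip time for a pulse to reach max_range_m and return."""
    return 2.0 * max_range_m / C


# A 200 m maximum range implies waiting ~1.33 microseconds per pulse.
print(max_listen_time(200.0))                 # ~1.33e-06 s
print(distance_from_time_of_flight(300e-9))   # ~45.0 m
```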
Disclosure of Invention
In order to improve resolution and/or dynamically change laser power when certain properties of the target are known, a lidar system operating in a ground vehicle (e.g., a car, truck, or agricultural vehicle) determines where the field of regard of the lidar system overlaps the area of the ground in front of the vehicle and adjusts one or more characteristics of the scan for this portion of the field of regard. More specifically, the lidar system may adjust the scan rate and/or the power of the outbound light pulses. The change in resolution may include a change in pixel density along the horizontal dimension and/or a change in line density (corresponding to pixel density along the vertical dimension).
One example embodiment of these techniques is a lidar system that includes a light source configured to emit pulses of light, a scanner configured to scan at least a portion of the emitted pulses of light along a scan pattern contained within a field of regard of the lidar system, a receiver configured to detect at least a portion of the scanned pulses of light scattered by one or more remote targets, and a processor. The field of regard includes a ground portion that overlaps an area of the ground in front of the lidar system. The processor is configured to identify the ground portion of the field of regard and to adjust one or more scan parameters so as to modify at least one of a resolution or a pulse energy for the ground portion of the field of regard, relative to another portion of the field of regard, when the emitted pulses scan the ground portion during a subsequent scan of the field of regard.
Another example implementation of these techniques is a method in a lidar system for scanning a field of regard of the lidar system. The method includes identifying, within the field of regard, a ground portion that overlaps an area of the ground in front of the lidar system; causing a light source to emit pulses of light; scanning at least a portion of the emitted pulses of light along a scan pattern contained within the field of regard, including adjusting one or more scan parameters to modify at least one of a resolution or a pulse energy for the ground portion of the field of regard relative to another portion of the field of regard; and detecting at least a portion of the scanned pulses of light scattered by one or more remote targets.
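To picture the per-region adjustment described above, the following minimal sketch may help. It is not the patented implementation: the class, the fields, and the factor-of-two policy are invented assumptions for illustration only.

```python
# Hypothetical sketch of adjusting scan parameters over the ground portion.
# ScanParams and the doubling policy are illustrative inventions.

from dataclasses import dataclass


@dataclass
class ScanParams:
    line_density: float     # scan lines per degree (vertical resolution)
    pixel_rate_hz: float    # pulse repetition frequency (horizontal resolution)
    pulse_energy_uj: float  # energy per emitted pulse


def adjust_for_ground(base: ScanParams, in_ground_portion: bool) -> ScanParams:
    """Return the scan parameters to use for the current region of the
    field of regard; outside the ground portion, use the baseline."""
    if not in_ground_portion:
        return base
    # Example policy: raise resolution and pulse energy over the ground
    # portion (the factors of 2 are arbitrary illustrative choices).
    return ScanParams(
        line_density=base.line_density * 2,
        pixel_rate_hz=base.pixel_rate_hz * 2,
        pulse_energy_uj=base.pulse_energy_uj * 2,
    )
```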
Yet another example embodiment of these techniques is an autonomous vehicle including a vehicle maneuvering assembly to effect at least steering, acceleration, and braking of the autonomous vehicle. The autonomous vehicle further includes a lidar system comprising: a light source configured to emit pulses of light, a scanner configured to scan at least a portion of the emitted pulses of light along a scan pattern contained within a field of regard of the lidar system (where the field of regard includes a ground portion that overlaps an area of the ground in front of the lidar system), and a receiver configured to detect at least a portion of the scanned pulses of light scattered by one or more remote targets. The autonomous vehicle also includes a vehicle controller communicatively coupled to the vehicle maneuvering assembly and the lidar system, the vehicle controller configured to control the vehicle maneuvering assembly using signals generated by the lidar system. The lidar system is configured to adjust one or more scan parameters, when the emitted pulses scan the ground portion of the field of regard during a subsequent scan, so as to modify at least one of a resolution or a pulse energy for the ground portion of the field of regard relative to another portion of the field of regard.
Another example embodiment of these techniques is a method in a lidar system for determining the reflectivity of an area within the field of regard. The method includes causing a light source to emit pulses of light and scanning the emitted pulses of light over the field of regard along a scan pattern. The method further includes determining an amount of emitted light directed toward a region of interest contained within the field of regard during the scan, determining an amount of light scattered by one or more targets located in the region of interest, and determining a physical property of the region of interest using the determined amount of emitted light and the determined amount of scattered light. In one embodiment, the physical property is an estimate of the reflectivity of the region of interest. In one embodiment, the method further includes determining, based on the estimated reflectivity, whether the region of interest is a ground area within the field of regard. In one embodiment, determining the amount of emitted light includes determining a total amount of light contained in multiple emitted pulses directed toward the region of interest according to the scan pattern. In one embodiment, determining the amount of scattered light includes determining a total amount of light contained in multiple returns corresponding to the multiple emitted pulses directed toward the region of interest according to the scan pattern. In some embodiments, determining the amount of scattered light includes eliminating statistical outliers from the multiple returns, or accounting for the absence of returns corresponding to some of the emitted pulses.
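A minimal sketch of this energy-ratio estimate follows. The names and the three-sigma outlier rule are invented assumptions; the disclosure does not prescribe a particular rule, and a calibrated estimate would also account for range and geometry.

```python
# Hypothetical sketch of the reflectivity estimate described above.
# The 3-sigma outlier rule is an illustrative assumption.

from statistics import mean, stdev


def estimate_region_reflectivity(emitted_uj, returned_uj):
    """Relative (uncalibrated) reflectivity estimate for a region of interest.

    emitted_uj:  energies of pulses directed into the region during a scan
    returned_uj: corresponding received energies (0.0 where no return)
    """
    if len(emitted_uj) != len(returned_uj) or not emitted_uj:
        raise ValueError("need one return entry per emitted pulse")
    # Discard statistical outliers among the returns (beyond 3 sigma here).
    mu = mean(returned_uj)
    sigma = stdev(returned_uj) if len(returned_uj) > 1 else 0.0
    kept = [(e, r) for e, r in zip(emitted_uj, returned_uj)
            if sigma == 0.0 or abs(r - mu) <= 3 * sigma]
    total_emitted = sum(e for e, _ in kept)
    total_returned = sum(r for _, r in kept)
    # Ratio of total scattered light to total emitted light over the region.
    return total_returned / total_emitted
```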
Drawings
FIG. 1 is a block diagram of an example light detection and ranging (lidar) system in which techniques of the present disclosure may be implemented;
FIG. 2 illustrates in more detail several components that may operate in the system of FIG. 1;
FIG. 3 is a block diagram of a lidar system in which a scanner includes a polygon mirror;
FIG. 4 illustrates an example configuration in which the assembly of FIG. 1 scans a 360 degree field of view through a window in a rotating housing;
FIG. 5 illustrates another configuration in which the assembly of FIG. 1 scans a 360 degree field of view through a substantially transparent stationary housing;
FIG. 6 illustrates an example scan pattern that the lidar system of FIG. 1 may produce when identifying targets within a field of regard;
FIG. 7 illustrates an example scan pattern that the lidar system of FIG. 1 may produce when identifying targets within a field of regard using multiple beams;
FIG. 8 schematically illustrates the field of view (FOV) of light sources and detectors that may be operated in the lidar system of FIG. 1;
FIG. 9 illustrates an example configuration of the lidar system of FIG. 1, or another suitable lidar system, in which the laser is located remotely from the sensor assembly;
FIG. 10 shows an example vehicle in which the lidar system of FIG. 1 may be operated;
FIG. 11 shows an example InGaAs avalanche photodiode that may be operated in the lidar system of FIG. 1;
FIG. 12 shows an example photodiode coupled to a pulse detection circuit that may operate in the lidar system of FIG. 1;
FIG. 13 illustrates an example light source, including a seed laser and an amplifier, that may operate in the lidar system of FIG. 1;
FIG. 14 illustrates an example implementation of the amplifier of FIG. 13;
FIG. 15 illustrates an example scenario within the field of regard of the lidar system of FIG. 1 operating in a vehicle, where the field of regard overlaps a ground area in front of the vehicle;
FIG. 16 is a flow diagram of an example method for adjusting one or more scan parameters to increase resolution and/or modify pulse power while scanning the ground in front of a vehicle;
FIG. 17 illustrates detection of the ground portion of the field of regard when the vehicle in which the lidar system operates travels on a road with a downward slope;
FIG. 18 illustrates detection of the ground portion of the field of regard when the vehicle in which the lidar system operates travels on a road with an upward slope;
FIG. 19 illustrates an example scan pattern with increased line density in the ground portion of the field of regard;
FIG. 20 illustrates an example scan pattern with increased horizontal resolution in the ground portion of the field of regard;
FIG. 21 illustrates another example selection of one or more scan parameters for scanning the ground portion of the field of regard; and
fig. 22 is a timing diagram of an example technique for transmitting a pulse of light upon detection of a return pulse, which may be implemented in a lidar system of the present disclosure.
Detailed Description
Overview
A lidar system configured to operate in a land vehicle determines where the field of regard of the lidar system overlaps the area of the ground in front of the lidar system (the "ground area"), based on data previously collected by the lidar system or on an indication from another sensor, and adjusts one or more scan parameters when scanning the corresponding portion of the field of regard (the "ground portion").
In some cases, the lidar system uses data acquired during one or more previous scans of the field of regard to identify the ground portion. The lidar system also may use data from a camera (e.g., a CCD or CMOS camera), an acoustic array, or another suitable sensor or combination of sensors. Furthermore, the lidar system may use positioning data together with terrain data from a GIS system (pre-stored for the relevant location or received via a communication network), also taking into account where the lidar system is mounted on the vehicle.
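For flat, level terrain, the boundary of the ground portion follows from simple geometry. The sketch below is only an illustration of that idea, under invented names and the assumption that the mounting height above the road is known; it is not a method prescribed by this disclosure.

```python
# Simplified flat-terrain sketch (not from the patent): elevation angles at or
# below the returned boundary angle intersect the road within the maximum range.

import math


def ground_boundary_angle_deg(mount_height_m: float, max_range_m: float) -> float:
    """Elevation angle (negative = below horizontal) at which a line of sight
    of length max_range_m first meets flat ground: sin(theta) = h / R_MAX."""
    return -math.degrees(math.asin(mount_height_m / max_range_m))


# A sensor mounted 1.5 m above a flat road, with a 200 m maximum range:
# everything below about -0.43 degrees elevation is the ground portion.
print(ground_boundary_angle_deg(1.5, 200.0))  # ~ -0.43
```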
The one or more scan parameters that the lidar system may modify include scan line density, horizontal resolution, pixel density, pulse rate or pulse repetition frequency, pulse energy, and the like. For example, the lidar system may change these scanning parameters by modifying the operation of the light source and/or the scanner. By adjusting one or more scan parameters for the ground portion of the field of view, the lidar system may generate a high resolution scan of a portion of the field of view and/or more efficiently distribute laser power over the field of view.
Thus, the lidar system may provide additional horizontal resolution in the many driving scenarios where it is desired. For example, when scanning a road, the additional horizontal resolution helps identify road markings, lanes, potholes, reflectors, or other objects located on or associated with the road. The lidar system also may provide additional laser power in those driving scenarios where it is required. For example, when a light pulse emitted by the lidar system strikes the road at a sufficiently shallow grazing angle (also known as a glancing angle), most of the pulse reflects forward off the road, and little light is scattered back toward the lidar system.
Exemplary lidar System
Fig. 1 illustrates an example light detection and ranging (lidar) system 100. Lidar system 100 may be referred to as a laser ranging system, a laser radar system, a LIDAR system, a lidar sensor, or a laser detection and ranging (LADAR or ladar) system. Lidar system 100 may include a light source 110, an optical coupling assembly 113 including a mirror 115, a scanner 120, a receiver 140, and a controller 150. The light source 110 may, for example, be a laser that emits light having a particular operating wavelength in the infrared, visible, or ultraviolet portions of the electromagnetic spectrum. As a more specific example, light source 110 may include a laser operating at a wavelength between about 1.2 μm and 1.7 μm.
In operation, the light source 110 emits an output beam 125, which may be continuous-wave (CW), pulsed, or modulated in any suitable manner for a given application. In the present disclosure, the emitted light may be described as pulses of light. In some embodiments, the duration of a pulse may be as long as the time interval until the next emitted pulse. In some embodiments, a pulse may exhibit a change in light intensity and/or a change in light frequency throughout the pulse duration. Thus, for example, in a frequency-modulated CW (FMCW) lidar system, a pulse may be defined by the entire period of the modulation frequency, even if the intensity remains constant within a pulse or from pulse to pulse. Output beam 125 is directed downrange toward a remote target 130, which is a distance D from lidar system 100 and is at least partially contained within the field of regard of system 100. Depending on the scenario and/or implementation of lidar system 100, D may be between 1 meter and 1 kilometer, for example.
Once output beam 125 reaches the downrange target 130, target 130 may scatter or, in some cases, reflect at least a portion of the light from output beam 125, and some of the scattered or reflected light may return toward lidar system 100. In the example of fig. 1, the scattered or reflected light is represented by input beam 135, which passes through scanner 120. Scanner 120 may be referred to as a beam scanner, an optical scanner, or a laser scanner. Input beam 135 passes through scanner 120 to mirror 115, which may be referred to as a fold mirror, an overlap mirror, or a beam combiner. Mirror 115 in turn directs input beam 135 to receiver 140. Input beam 135 may contain only a relatively small portion of the light from output beam 125. For example, the ratio of the average power, peak power, or pulse energy of input beam 135 to that of output beam 125 may be approximately 10^-1, 10^-2, 10^-3, 10^-4, 10^-5, 10^-6, 10^-7, 10^-8, 10^-9, 10^-10, 10^-11, or 10^-12. As another example, if a pulse of output beam 125 has a pulse energy of 1 microjoule (μJ), then the corresponding pulse of input beam 135 may have a pulse energy of approximately 10 nanojoules (nJ), 1 nJ, 100 picojoules (pJ), 10 pJ, 1 pJ, 100 femtojoules (fJ), 10 fJ, 1 fJ, 100 attojoules (aJ), 10 aJ, or 1 aJ.
Output beam 125 may be referred to as a laser beam, a light beam, an optical beam, an emitted beam, or simply a beam; and the input beam 135 may be referred to as a return beam, a received beam, a return light, a received light, an input light, a scattered light, or a reflected light. As used herein, scattered light may refer to light scattered or reflected by the target 130. Input beam 135 may include light from output beam 125 scattered by target 130, light from output beam 125 reflected by target 130, or a combination of scattered and reflected light from target 130.
The operating wavelength of lidar system 100 may be in the infrared, visible, or ultraviolet portions of the electromagnetic spectrum, for example. The sun also produces light in these wavelength ranges, and thus sunlight may act as background noise that may mask the signal light detected by lidar system 100. Such solar background noise may cause false positive detections or may otherwise corrupt measurements of lidar system 100, particularly when receiver 140 includes a SPAD detector (which may be highly sensitive).
In general, light from the sun that passes through the earth's atmosphere and reaches a terrestrial lidar system (e.g., system 100) may establish an optical background noise floor for the system. Thus, in order to be detectable, a signal from lidar system 100 must rise above the background noise floor. The signal-to-noise ratio (SNR) of lidar system 100 may generally be increased by raising the power level of output beam 125, but in some cases it may be desirable to keep the power level of output beam 125 relatively low. For example, increasing the transmit power level of output beam 125 may result in lidar system 100 not being eye-safe.
In some embodiments, lidar system 100 operates at one or more wavelengths between about 1400nm and about 1600 nm. For example, the light source 110 may generate light at about 1550 nm.
In some embodiments, lidar system 100 operates at wavelengths where atmospheric absorption is relatively low. For example, lidar system 100 may operate at wavelengths in the approximate ranges of 980 nm to 1110 nm or 1165 nm to 1400 nm.
In other embodiments, lidar system 100 operates at wavelengths where atmospheric absorption is relatively high. For example, lidar system 100 may operate at wavelengths in the approximate ranges of 930 nm to 980 nm, 1100 nm to 1165 nm, or 1400 nm to 1460 nm.
According to some embodiments, lidar system 100 may include an eye-safe laser, or lidar system 100 may be classified as an eye-safe laser system or laser product. An eye-safe laser, laser system, or laser product may refer to a system having an emission wavelength, average power, peak intensity, pulse energy, beam size, beam divergence, exposure time, or output beam that is scanned such that light emitted from the system causes little or no damage to a human eye. For example, light source 110 or lidar system 100 may be classified as a class 1 laser product (as specified by the 60825-1 standard of the International Electrotechnical Commission (IEC)) or a class I laser product (as specified by part 1040.10 of the united states Code of Federal Regulations (CFR) 21), which is safe under all conditions of normal use. In some embodiments, lidar system 100 may be classified as an eye-safe laser product (e.g., having a class 1 or class I classification) configured to operate at any suitable wavelength between about 1400nm and about 2100 nm. In some embodiments, light source 110 may include a laser having an operating wavelength between about 1400nm and about 1600nm, and lidar system 100 may operate in an eye-safe manner. In some embodiments, light source 110 or lidar system 100 may be an eye-safe laser product that includes a scanning laser having an operating wavelength between about 1530nm and about 1560 nm. In some embodiments, lidar system 100 may be a class 1 or class I laser product that includes a fiber laser or solid state laser operating at a wavelength between about 1400nm and about 1600 nm.
The receiver 140 may receive or detect photons from the input beam 135 and generate one or more representative signals. For example, the receiver 140 may generate an output electrical signal 145 that is representative of the input beam 135. The receiver may send the electrical signal 145 to the controller 150. Depending on the implementation, controller 150 may include one or more processors, application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), and/or other suitable circuitry configured to analyze one or more characteristics of electrical signal 145 to determine one or more characteristics of target 130, such as its distance downrange from lidar system 100. More specifically, the controller 150 may analyze the time of flight or phase modulation of the beam of light 125 emitted by the light source 110. If lidar system 100 measures a time of flight T (e.g., T represents the round-trip time of flight for an emitted pulse of light to travel from lidar system 100 to target 130 and back to lidar system 100), then the distance D from target 130 to lidar system 100 may be expressed as D = c·T/2, where c is the speed of light (approximately 3.0×10^8 m/s).
As a more specific example, if lidar system 100 measures a time of flight of T = 300 ns, lidar system 100 may determine that the distance from target 130 to lidar system 100 is approximately D = 45.0 m. As another example, if lidar system 100 measures a time of flight of T = 1.33 μs, it determines that the distance from target 130 to lidar system 100 is approximately D = 199.5 m. The distance D from lidar system 100 to target 130 may be referred to as the distance, depth, or range of target 130. As used herein, the speed of light c refers to the speed of light in any suitable medium (e.g., in air, water, or vacuum). The speed of light in vacuum is approximately 2.9979×10^8 m/s, and the speed of light in air (refractive index of approximately 1.0003) is approximately 2.9970×10^8 m/s.
Target 130 may be located a distance D from lidar system 100 that is less than or equal to the maximum range R_MAX of lidar system 100. The maximum range R_MAX (which also may be referred to as the maximum distance) of lidar system 100 may correspond to the maximum distance over which lidar system 100 is configured to sense or identify targets present in the field of regard of lidar system 100. The maximum range of lidar system 100 may be any suitable distance, for example 25 m, 50 m, 100 m, 200 m, 500 m, or 1 km. As a particular example, a lidar system with a 200-m maximum range may be configured to sense or identify various targets up to 200 m away. For a 200-m maximum range (R_MAX = 200 m), the time of flight corresponding to the maximum range is approximately t_max = 2·R_MAX/c ≈ 1.33 μs.
In some embodiments, light source 110, scanner 120, and receiver 140 may be packaged together within a single housing 155, which single housing 155 may be a box, case, or enclosure that houses or contains all or part of lidar system 100. The housing 155 includes a window 157 through which the light beams 125 and 135 pass. In an example embodiment, lidar system housing 155 contains light source 110, overlap mirror 115, scanner 120, and receiver 140 of lidar system 100. The controller 150 may be located within the same housing 155 as the components 110, 120, and 140, or the controller 150 may be located remotely from the housing.
Further, in some embodiments, housing 155 includes multiple lidar sensors, each including a respective scanner and receiver. Depending on the particular implementation, each of the multiple sensors may include a separate light source or a common light source. Depending on the implementation, the multiple sensors may be configured to cover non-overlapping adjacent fields of regard or partially overlapping fields of regard.
The housing 155 may be an airtight or watertight structure that prevents water vapor, liquid water, dirt, dust, or other contaminants from entering the housing 155. The housing 155 may be filled with a dry or inert gas, such as dry air, nitrogen, or argon. The housing 155 may include one or more electrical connections for transferring electrical power or signals to and/or from the housing.
The window 157 may be made from any suitable substrate material, such as glass or plastic (e.g., polycarbonate, acrylic, cyclic-olefin polymer, or cyclic-olefin copolymer). The window 157 may include an interior surface (surface A) and an exterior surface (surface B), and surface A or surface B may include a dielectric coating having particular reflectivity values at particular wavelengths. A dielectric coating (which may be referred to as a thin-film coating, interference coating, or coating) may include one or more thin-film layers of dielectric material (e.g., SiO2, TiO2, Al2O3, Ta2O5, MgF2, LaF3, or AlF3) having particular thicknesses (e.g., thicknesses less than 1 μm) and particular refractive indices. A dielectric coating may be deposited onto surface A or surface B of the window 157 using any suitable deposition technique, such as sputtering or electron-beam deposition.
The dielectric coating may have a high reflectivity at a particular wavelength or a low reflectivity at a particular wavelength. A high-reflectivity (HR) dielectric coating may have any suitable reflectivity value (e.g., a reflectivity greater than or equal to 80%, 90%, 95%, or 99%) at any suitable wavelength or combination of wavelengths. A low-reflectivity dielectric coating, which may be referred to as an anti-reflection (AR) coating, may have any suitable reflectivity value (e.g., a reflectivity less than or equal to 5%, 2%, 1%, 0.5%, or 0.2%) at any suitable wavelength or combination of wavelengths. In particular embodiments, a dielectric coating may be a dichroic coating with a particular combination of high or low reflectivity values at particular wavelengths. For example, the dichroic coating may have a reflectivity of less than or equal to 0.5% at approximately 1550-1560 nm and a reflectivity of greater than or equal to 90% at approximately 800-1500 nm.
In some embodiments, surface A or surface B has a dielectric coating that is anti-reflecting at an operating wavelength of the one or more light sources 110 contained within housing 155. An AR coating on surfaces A and B may increase the amount of light transmitted through window 157 at the operating wavelength of light source 110. Additionally, an AR coating at the operating wavelength of light source 110 may reduce the amount of incident light from output beam 125 that is reflected by window 157 back into housing 155. In one example embodiment, surface A and surface B each have an AR coating with a reflectivity of less than 0.5% at the operating wavelength of light source 110. As an example, if light source 110 has an operating wavelength of approximately 1550 nm, surface A and surface B may each have an AR coating with a reflectivity of less than 0.5% from approximately 1547 nm to approximately 1553 nm. In another embodiment, surface A and surface B each have an AR coating with a reflectivity of less than 1% at the operating wavelength of light source 110. For example, if housing 155 encloses two sensor heads with respective light sources, where the first light source emits pulses at a wavelength of approximately 1535 nm and the second light source emits pulses at a wavelength of approximately 1540 nm, then surface A and surface B may each have an AR coating with a reflectivity of less than 1% from approximately 1530 nm to approximately 1545 nm.
The window 157 may have a light transmittance that is greater than any suitable value for the one or more wavelengths of the one or more light sources 110 contained within the housing 155. As an example, the window 157 may have a light transmittance of greater than or equal to 70%, 80%, 90%, 95%, or 99% at the wavelength of the light source 110. In one exemplary embodiment, the window 157 may transmit greater than or equal to 95% of light at the operating wavelength of the light source 110. In another embodiment, the window 157 transmits greater than or equal to 90% of light at the operating wavelength of the light source enclosed within the housing 155.
Surface A or surface B may have a dichroic coating that is anti-reflecting at one or more operating wavelengths of one or more light sources 110 and high-reflecting at wavelengths away from the one or more operating wavelengths. For example, surface A may have an AR coating for an operating wavelength of light source 110, and surface B may have a dichroic coating that is AR at the light-source operating wavelength and HR for wavelengths away from the operating wavelength. A coating that is HR for wavelengths away from the light-source operating wavelength may prevent most incident light at unwanted wavelengths from being transmitted through window 157. In one embodiment, if light source 110 emits optical pulses with a wavelength of approximately 1550 nm, then surface A may have an AR coating with a reflectivity of less than or equal to 0.5% from approximately 1546 nm to approximately 1554 nm. Additionally, surface B may have a dichroic coating that is AR at approximately 1546-1554 nm and HR (e.g., reflectivity greater than or equal to 90%) at approximately 800-1500 nm and approximately 1580-1700 nm.
Surface B of window 157 may include an oleophobic, hydrophobic, or hydrophilic coating. The oleophobic (or lipophobic) coating can repel oil (e.g., fingerprint oil or other non-polar material) from the outer surface (surface B) of the window 157. The hydrophobic coating may repel water from the outer surface. For example, surface B may be coated with a material that is both oleophobic and hydrophobic. Hydrophilic coatings attract water, so water may tend to wet and form a film on hydrophilic surfaces (rather than bead on hydrophobic surfaces). If the surface B has a hydrophilic coating, water (e.g., from rain) falling on the surface B may form a film on the surface. A thin surface film of water may result in less distortion, deflection, or absorption of the output beam 125 than a surface with a non-hydrophilic coating or a hydrophobic coating.
With continued reference to fig. 1, the light source 110 may include a pulsed laser configured to produce or emit pulses of light with a certain pulse duration. In one example embodiment, the pulse duration or pulse width of the pulsed laser is approximately 10 picoseconds (ps) to 100 nanoseconds (ns). In another embodiment, light source 110 is a pulsed laser that produces pulses with a pulse duration of approximately 1-4 ns. In yet another embodiment, light source 110 is a pulsed laser that produces pulses at a pulse repetition frequency of approximately 100 kHz to 5 MHz or a pulse period (e.g., a time between consecutive pulses) of approximately 200 ns to 10 μs. Depending on the implementation, the light source 110 may have a substantially constant or a variable pulse repetition frequency. As an example, light source 110 may be a pulsed laser that produces pulses at a substantially constant pulse repetition frequency of approximately 640 kHz (e.g., 640,000 pulses per second), corresponding to a pulse period of approximately 1.56 μs. As another example, light source 110 may have a pulse repetition frequency that can be varied between approximately 500 kHz and 3 MHz. As used herein, a pulse of light may be referred to as an optical pulse, a light pulse, or a pulse, and a pulse repetition frequency may be referred to as a pulse rate.
In general, the output beam 125 may have any suitable average optical power, and the output beam 125 may include optical pulses with any suitable pulse energy or peak optical power. Some examples of the average power of output beam 125 include the approximate values of 1 mW, 10 mW, 100 mW, 1 W, and 10 W. Example values of the pulse energy of output beam 125 include the approximate values of 0.1 μJ, 1 μJ, 10 μJ, 100 μJ, and 1 mJ. Examples of peak power values of pulses included in output beam 125 are the approximate values of 10 W, 100 W, 1 kW, 5 kW, and 10 kW. An example optical pulse with a duration of 1 ns and a pulse energy of 1 μJ has a peak power of approximately 1 kW. If the pulse repetition frequency is 500 kHz, then the average power of output beam 125 with 1-μJ pulses is approximately 0.5 W in this example.
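These figures follow from the relations peak power ≈ pulse energy / pulse duration and average power = pulse energy × pulse repetition frequency. A small illustrative check (the function names are invented, not from the patent):

```python
# Illustrative check of the pulse-power relationships cited above.


def peak_power_w(pulse_energy_j: float, duration_s: float) -> float:
    """Approximate peak power of a single pulse."""
    return pulse_energy_j / duration_s


def average_power_w(pulse_energy_j: float, rep_rate_hz: float) -> float:
    """Average optical power of a pulse train."""
    return pulse_energy_j * rep_rate_hz


print(peak_power_w(1e-6, 1e-9))       # 1000.0 W: 1 uJ in 1 ns is ~1 kW peak
print(average_power_w(1e-6, 500e3))   # 0.5 W: 1 uJ pulses at 500 kHz
```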
The light source 110 may include a laser diode, such as a fabry-perot laser diode, a quantum well laser, a Distributed Bragg Reflector (DBR) laser, a Distributed Feedback (DFB) laser, or a Vertical Cavity Surface Emitting Laser (VCSEL). The laser diodes operated in the light source 110 may be aluminum gallium arsenide (AlGaAs) laser diodes, indium gallium arsenide (InGaAs) laser diodes, or indium gallium arsenide phosphide (InGaAsP) laser diodes, or any other suitable diodes. In some embodiments, the light source 110 comprises a pulsed laser diode having a peak emission wavelength of about 1400-1600 nm. In addition, the light source 110 may include a laser diode that is current modulated to generate optical pulses.
In some embodiments, the light source 110 comprises a pulsed laser diode followed by one or more optical amplification stages. For example, the optical source 110 may be a fiber laser module comprising a current-modulated laser diode having a peak wavelength of about 1550nm followed by a single or multi-stage Erbium Doped Fiber Amplifier (EDFA). As another example, the optical source 110 may comprise a Continuous Wave (CW) or quasi-CW laser diode followed by an external optical modulator (e.g., an electro-optic modulator), and the output of the modulator may be fed into an optical amplifier. In other embodiments, the light source 110 may include a laser diode that produces optical pulses that are not amplified by an optical amplifier. As an example, a laser diode (which may be referred to as a direct emitter or direct emitter laser diode) may emit an optical pulse that forms an output beam 125 directed forward from laser radar system 100. In other embodiments, the light source 110 may include a pulsed solid state laser or a pulsed fiber laser.
Although this disclosure describes or illustrates example embodiments of a lidar system or light source that produces a light waveform that includes pulses of light, the embodiments described or illustrated herein may also be applied to other types of light waveforms, including Continuous Wave (CW) light or modulated light waveforms. For example, the lidar systems described or illustrated herein may include a light source configured to generate pulses of light. Alternatively, the lidar system may be configured to operate as a Frequency Modulated Continuous Wave (FMCW) lidar system, and may include a light source configured to generate CW light or a frequency modulated light waveform.
Pulsed lidar systems are one type of lidar system, in which the light source emits pulses of light and the distance to a remote target is determined from the time of flight for a pulse of light to travel to the target and back. Another type of lidar system is a frequency-modulated lidar system, which may be referred to as a frequency-modulated continuous-wave (FMCW) lidar system. An FMCW lidar system uses frequency-modulated light to determine the distance to a remote target based on the modulation frequency of the received light (scattered from the remote target) relative to the modulation frequency of the emitted light. For example, for a linearly chirped light source (e.g., frequency modulation that produces a linear change in frequency with time), the larger the frequency difference between the emitted light and the received light, the farther away the target is located. The frequency difference may be determined by mixing the received light with a portion of the emitted light (e.g., by coupling the two beams onto a detector, or by mixing analog electrical signals corresponding to the received light and the emitted light) and determining the resulting beat frequency. For example, the electrical signal from an APD may be analyzed using a fast Fourier transform (FFT) technique to determine the frequency difference between the emitted light and the received light.
If a linear frequency modulation m (e.g., in units of Hz/s) is applied to a CW laser, then the distance D from the target to the lidar system may be expressed as D = c·Δf/(2m), where c is the speed of light and Δf is the frequency difference between the emitted light and the received light. For example, for a linear frequency modulation of 10^12 Hz/s (or 1 MHz/μs), if a frequency difference of 330 kHz is measured, then the distance to the target is approximately 50 meters. Additionally, a frequency difference of 1.33 MHz corresponds to a target located approximately 200 meters away.
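As an illustrative check of this relation (the function name is invented, and a real FMCW system would obtain Δf from the measured beat spectrum rather than as an input):

```python
# Illustrative FMCW range arithmetic: D = c * delta_f / (2 * m).

C = 2.9979e8  # speed of light, m/s


def fmcw_distance_m(delta_f_hz: float, chirp_rate_hz_per_s: float) -> float:
    """Distance implied by a measured beat frequency for a linear chirp."""
    return C * delta_f_hz / (2.0 * chirp_rate_hz_per_s)


# With a 1e12 Hz/s (1 MHz/us) chirp, a 330 kHz beat corresponds to ~50 m
# and a 1.33 MHz beat to ~200 m, matching the example above.
print(fmcw_distance_m(330e3, 1e12))    # ~49.5 m
print(fmcw_distance_m(1.33e6, 1e12))   # ~199.4 m
```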
The light source for an FMCW lidar system may be a fiber laser (e.g., a seed laser diode followed by one or more optical amplifiers) or a direct emitting laser diode. The seed laser diode or direct emitting laser diode may be operated in CW mode (e.g., by driving the laser diode with a substantially constant DC current), and the frequency modulation may be provided by an external modulator (e.g., an electro-optic phase modulator). Alternatively, the frequency modulation may be generated by applying a DC bias current to the seed laser diode or the direct emitting laser diode together with the current modulation. The current modulation produces a corresponding refractive index modulation in the laser diode, thereby frequency modulating the light emitted by the laser diode. The current modulation component (and corresponding frequency modulation) may have any suitable frequency or shape (e.g., piecewise linear, sinusoidal, triangular, or sawtooth).
In some embodiments, the output beam 125 emitted by the light source 110 is a collimated optical beam with any suitable beam divergence, such as a divergence of approximately 0.1 to 3.0 milliradians (mrad). The divergence of output beam 125 may refer to an angular measure of the increase in beam size (e.g., beam radius or beam diameter) as output beam 125 travels away from light source 110 or lidar system 100. Output beam 125 may have a substantially circular cross-section with a beam divergence characterized by a single divergence value. For example, an output beam 125 with a circular cross-section and a divergence of 1 mrad may have a beam diameter or spot size of approximately 10 cm at a distance of 100 m from lidar system 100. In some embodiments, output beam 125 may be an astigmatic beam or may have a substantially elliptical cross-section, and may be characterized by two divergence values. As an example, output beam 125 may have a fast axis and a slow axis, where the fast-axis divergence is greater than the slow-axis divergence. As another example, output beam 125 may be an astigmatic beam with a fast-axis divergence of 2 mrad and a slow-axis divergence of 0.5 mrad.
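The 10-cm figure follows from the small-angle approximation, spot diameter ≈ range × divergence. A minimal sketch (names invented, and the initial beam diameter is neglected here as an assumption):

```python
# Illustrative small-angle estimate of beam spot size from divergence.


def spot_diameter_m(range_m: float, divergence_rad: float,
                    initial_diameter_m: float = 0.0) -> float:
    """Approximate beam diameter at `range_m` for a full-angle divergence,
    small-angle approximation: d ~ d0 + range * divergence."""
    return initial_diameter_m + range_m * divergence_rad


# 1 mrad of divergence gives roughly a 10 cm spot at 100 m, as noted above.
print(spot_diameter_m(100.0, 1e-3))  # 0.1 m
```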
The output light beam 125 emitted by the light source 110 may be unpolarized or randomly polarized, may have no particular or fixed polarization (e.g., the polarization may vary over time), or may have a particular polarization (e.g., the output light beam 125 may be linearly polarized, elliptically polarized, or circularly polarized). As an example, light source 110 may generate linearly polarized light, and lidar system 100 may include a quarter wave plate that converts the linearly polarized light to circularly polarized light. Lidar system 100 may transmit circularly polarized light as output beam 125 and receive input beam 135, where input beam 135 may be substantially or at least partially circularly polarized in the same manner as output beam 125 (e.g., if output beam 125 is right-handed circularly polarized, input beam 135 may also be right-handed circularly polarized). The input light beam 135 may pass through the same quarter wave plate (or a different quarter wave plate) resulting in the input light beam 135 being converted to linearly polarized light that is orthogonally polarized (e.g., polarized at a right angle) relative to the linearly polarized light produced by the light source 110. As another example, laser radar system 100 may employ polarization diversity detection, in which two polarization components are detected separately. Output beam 125 may be linearly polarized, and lidar system 100 may split input beam 135 into two polarization components (e.g., s-polarized and p-polarized) that are detected by two photodiodes (e.g., an optical receiver including two photodiodes), respectively.
With continued reference to fig. 1, output beam 125 and input beam 135 may be substantially coaxial. In other words, output beam 125 and input beam 135 may at least partially overlap or share a common propagation axis such that input beam 135 and output beam 125 travel along substantially the same optical path (although in opposite directions). As lidar system 100 scans output beam 125 over the field of view, input beam 135 may follow along output beam 125, thereby maintaining a coaxial relationship between the two beams.
Lidar system 100 may also include one or more optical components configured to condition, shape, filter, modify, manipulate, or direct output beam 125 and/or input beam 135. For example, laser radar system 100 may include one or more lenses, mirrors, filters (e.g., bandpass or interference filters), beam splitters, polarizers, polarizing beam splitters, waveplates (e.g., half-wave or quarter-wave plates), diffractive elements, or holographic elements. In some embodiments, lidar system 100 includes a telescope, one or more lenses, or one or more mirrors to expand, focus, or collimate output beam 125 to a desired beam diameter or divergence. As an example, laser radar system 100 may include one or more lenses to focus input beam 135 onto an active area of receiver 140. As another example, laser radar system 100 may include one or more flat or curved mirrors (e.g., concave, convex, or parabolic mirrors) to steer or focus output beam 125 or input beam 135. For example, lidar system 100 may include an off-axis parabolic mirror to focus input beam 135 onto the active area of receiver 140. As shown in fig. 1, lidar system 100 may include a mirror 115, which may be a metal mirror or a dielectric mirror. The mirror 115 may be configured such that the light beam 125 passes through the mirror 115. By way of example, the mirror 115 may include a hole, slot, or aperture through which the output light beam 125 passes. As another example, the mirror 115 may be configured such that at least 80% of the output beam 125 passes through the mirror 115 and at least 80% of the input beam 135 is reflected by the mirror 115. In some embodiments, mirror 115 may substantially co-axially align output beam 125 and input beam 135 such that beams 125 and 135 travel along substantially the same optical path in opposite directions.
Although the assembly 113 in this example embodiment includes an overlap mirror 115 with an aperture through which the output beam 125 travels from the light source 110 toward the scanner 120, the assembly 113 in general may include a mirror without an aperture, with the output beam 125 directed past the mirror rather than through it, for example, according to off-axis illumination techniques. More generally, assembly 113 may include any suitable optical elements to direct output beam 125 toward scanner 120 and input beam 135 toward receiver 140.
In general, the scanner 120 steers the output beam 125 in one or more forward directions. The scanner 120 can include, for example, one or more scanning mirrors and one or more actuators that drive the mirrors to rotate, tilt, pivot, or move the mirrors, for example, in an angular manner about one or more axes. For example, a first mirror of the scanner may scan the output beam 125 in a first direction, and a second mirror may scan the output beam 125 in a second direction substantially orthogonal to the first direction. An example implementation of the scanner 120 is discussed in more detail below with reference to fig. 2.
The scanner 120 may be configured to scan the output beam 125 over a 5-degree angular range, 20-degree angular range, 30-degree angular range, 60-degree angular range, or any other suitable angular range. For example, a scanning mirror may be configured to periodically rotate over a 15-degree range, which results in the output beam 125 scanning over a 30-degree range (e.g., a Θ-degree rotation of the scanning mirror results in a 2Θ-degree angular scan of the output beam 125). The field of regard (FOR) of lidar system 100 may refer to an area, region, or angular range over which lidar system 100 may be configured to scan or capture distance information. When lidar system 100 scans the output beam 125 over a 30-degree scanning range, lidar system 100 may be referred to as having a 30-degree angular field of regard. As another example, a lidar system 100 with a scanning mirror that rotates over a 30-degree range may produce an output beam 125 that scans across a 60-degree range (e.g., a 60-degree FOR). In various embodiments, lidar system 100 may have a FOR of approximately 10°, 20°, 40°, 60°, 120°, or any other suitable FOR. The FOR may also be referred to as a scan area.
Scanner 120 may be configured to scan output beam 125 both horizontally and vertically, and lidar system 100 may have a particular FOR along the horizontal direction and another particular FOR along the vertical direction. For example, lidar system 100 may have a horizontal FOR of 10° to 120° and a vertical FOR of 2° to 45°.
One or more scan mirrors of scanner 120 may be communicatively coupled to controller 150, which may control the scan mirrors so as to guide output beam 125 in a desired direction or along a desired scan pattern. In general, a scan pattern may refer to a pattern or path along which output beam 125 is directed, and may also be referred to as an optical scan pattern, an optical scan path, or a scan path. As an example, scanner 120 may include two scan mirrors configured to scan output beam 125 across a 60° horizontal FOR and a 20° vertical FOR. The two scan mirrors may be controlled to follow a scan path that substantially covers the 60°×20° FOR. Lidar system 100 may use the scan path to generate a point cloud with pixels that substantially cover the 60°×20° FOR. The pixels may be approximately evenly distributed across the 60°×20° FOR. Alternatively, the pixels may have a particular non-uniform distribution (e.g., the pixels may be distributed across all or a portion of the 60°×20° FOR, and the pixels may have a higher density in one or more particular regions of the 60°×20° FOR).
In operation, light source 110 may emit pulses of light that scanner 120 scans across the FOR of lidar system 100. Target 130 may scatter one or more of the emitted pulses, and receiver 140 may detect at least a portion of the pulses of light scattered by target 130.
The receiver 140 may be referred to as (or may include) an optical receiver, an optical sensor, a detector, a light detector, or an optical detector. In some embodiments, the receiver 140 receives or detects at least a portion of the input light beam 135 and generates an electrical signal corresponding to the input light beam 135. For example, if the input beam 135 includes optical pulses, the receiver 140 may generate current or voltage pulses corresponding to the optical pulses detected by the receiver 140. In an example embodiment, the receiver 140 includes one or more Avalanche Photodiodes (APDs) or one or more Single Photon Avalanche Diodes (SPADs). In another embodiment, receiver 140 includes one or more PN photodiodes (e.g., a photodiode structure formed of a p-type semiconductor and an n-type semiconductor) or one or more PIN photodiodes (e.g., a photodiode structure formed of an undoped intrinsic semiconductor region located between p-type and n-type regions).
The receiver 140 may have an active region or avalanche multiplication region comprising silicon, germanium, or InGaAs. The active area of the receiver 140 may have any suitable size, for example, a diameter or width of about 50-500 μm. Receiver 140 may include circuitry to perform signal amplification, sampling, filtering, signal conditioning, analog-to-digital conversion, time-to-digital conversion, pulse detection, threshold detection, rising edge detection, or falling edge detection. For example, the receiver 140 may include a transimpedance amplifier that converts a received photocurrent (e.g., a current generated by an APD in response to a received optical signal) into a voltage signal. The receiver 140 may direct the voltage signal to a pulse detection circuit that generates an analog or digital output signal 145, the analog or digital output signal 145 corresponding to one or more characteristics (e.g., rising edge, falling edge, amplitude, or duration) of the received optical pulse. For example, the pulse detection circuit may perform a time-to-digital conversion to produce the digital output signal 145. The receiver 140 may send the electrical output signal 145 to the controller 150 for processing or analysis, for example, to determine a time-of-flight value corresponding to the received optical pulse.
The controller 150 may be electrically coupled or otherwise communicatively coupled to one or more of the light source 110, the scanner 120, and the receiver 140. The controller 150 may receive electrical trigger pulses or edges from the light source 110, where each pulse or edge corresponds to the emission of an optical pulse by the light source 110. The controller 150 may provide instructions, a control signal, or a trigger signal to the light source 110 indicating when the light source 110 should produce optical pulses. For example, the controller 150 may send an electrical trigger signal that includes electrical pulses, where the light source 110 emits an optical pulse in response to each electrical pulse. Additionally, the controller 150 may cause the light source 110 to adjust one or more of the frequency, period, duration, pulse energy, peak power, average power, or wavelength of the optical pulses produced by the light source 110.
The controller 150 may determine the time-of-flight value of the optical pulse based on timing information associated with: when the pulse is emitted by the light source 110, and when a portion of the pulse (e.g., the input light beam 135) is detected or received by the receiver 140. The controller 150 may include circuitry to perform signal amplification, sampling, filtering, signal conditioning, analog-to-digital conversion, time-to-digital conversion, pulse detection, threshold detection, rising edge detection, or falling edge detection.
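The time-of-flight calculation described above reduces to the round-trip relation D = c·T/2. A minimal, purely illustrative Python sketch of that arithmetic (the function and variable names are ours) follows:

```python
C = 299_792_458.0  # speed of light in m/s

def distance_from_time_of_flight(t_emit_s: float, t_receive_s: float) -> float:
    """Round-trip time-of-flight T gives the one-way distance D = c * T / 2."""
    time_of_flight_s = t_receive_s - t_emit_s
    return C * time_of_flight_s / 2.0

# A pulse received ~667 ns after emission corresponds to a target ~100 m away.
print(distance_from_time_of_flight(0.0, 667e-9))  # ≈ 100.0 m
```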
Controller 150 may also receive signals from one or more sensors 158, which in various embodiments are internal to lidar system 100 or, as illustrated in fig. 1, external to lidar system 100. The one or more sensors 158 may include a camera (e.g., a CCD or CMOS digital camera), a microphone or microphone array, a radar, etc. In some embodiments, lidar system 100 uses the signals from the one or more sensors 158 to determine which portion of the field of regard overlaps the ground ahead of the lidar system.
As described above, lidar system 100 may be used to determine a distance to one or more targets 130 located ahead of it. By scanning the output beam across a field of regard, the system can be used to map distances to a number of points within the field of regard. Each of these depth-mapped points may be referred to as a pixel or a voxel. A set of pixels captured in succession (which may be referred to as a depth map, a point cloud, or a frame) may be rendered as an image, or may be analyzed to identify or detect objects or to determine the shape or distance of objects within the FOR. For example, a depth map may cover a field of regard extending 60° horizontally and 15° vertically, and may include a frame of 100-2,000 pixels in the horizontal direction by 4-400 pixels in the vertical direction.
Lidar system 100 may be configured to repeatedly capture or generate point clouds of a field of regard at any suitable frame rate between approximately 0.1 frames per second (FPS) and approximately 1,000 FPS. For example, lidar system 100 may generate point clouds at a frame rate of approximately 0.1 FPS, 0.5 FPS, 1 FPS, 2 FPS, 5 FPS, 10 FPS, 20 FPS, 100 FPS, 500 FPS, or 1,000 FPS. In an example embodiment, lidar system 100 is configured to produce optical pulses at a rate of 5×10⁵ pulses/second (e.g., the system may determine 500,000 pixel distances per second) and scan a frame of 1000×50 pixels (e.g., 50,000 pixels/frame), which corresponds to a point-cloud frame rate of 10 frames per second (e.g., 10 point clouds per second). The point-cloud frame rate may be substantially fixed or dynamically adjustable, depending on the implementation. For example, lidar system 100 may capture one or more point clouds at a particular frame rate (e.g., 1 Hz) and then switch to capturing one or more point clouds at a different frame rate (e.g., 10 Hz). In general, the lidar system may use a slower frame rate (e.g., 1 Hz) to capture one or more high-resolution point clouds and a faster frame rate (e.g., 10 Hz) to rapidly capture multiple lower-resolution point clouds.
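The frame-rate figures above follow from dividing the pulse rate by the frame size, since each emitted pulse corresponds to one pixel. A minimal illustrative sketch (the names are ours):

```python
def point_cloud_frame_rate(pulse_rate_hz: float, pixels_per_frame: int) -> float:
    """One pulse per pixel, so frame rate = pulse rate / pixels per frame."""
    return pulse_rate_hz / pixels_per_frame

# 5e5 pulses/second with a 1000 x 50 pixel frame gives 10 point clouds/second.
print(point_cloud_frame_rate(5e5, 1000 * 50))  # 10.0
```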
The field of view of lidar system 100 may overlap, encompass, or surround at least a portion of target 130, which target 130 may include all or part of an object that is moving or stationary relative to lidar system 100. For example, the target 130 may include all or part of the following: a person, a vehicle, a motorcycle, a truck, a train, a bicycle, a wheelchair, a pedestrian, an animal, a road sign, a traffic light, a lane sign, a pavement sign, a parking space, a tower, a guard rail, a traffic obstacle, a pothole, a railroad crossing, an obstacle in or near a road, a curb, a vehicle parked on or off a road, a utility pole, a house, a building, a trash can, a mailbox, a tree, any other suitable object, or a combination of all or part of any suitable two or more objects.
Referring now to fig. 2, a scanner 162 and a receiver 164 may operate as scanner 120 and receiver 140, respectively, in the lidar system of fig. 1. More generally, the scanner 162 and receiver 164 may operate in any suitable lidar system.
Scanner 162 may include any suitable number of mirrors driven by any suitable number of mechanical actuators. For example, the scanner 162 may include a galvanometer scanner, a resonant scanner, a piezoelectric actuator, a polygon scanner, a rotating prism scanner, a voice coil motor, a DC motor, a brushless DC motor, a stepper motor, or a micro-electromechanical system (MEMS) device, or any other suitable actuator or mechanism.
A galvanometer scanner (which may also be referred to as a galvanometer actuator) may include a galvanometer-based scanning motor having a magnet and a coil. When current is supplied to the coil, a rotational force will be applied to the magnet, which causes the mirror attached to the galvanometer scanner to rotate. The current supplied to the coil can be controlled to dynamically change the position of the galvanometer mirror. A resonant scanner, which may be referred to as a resonant actuator, may include a spring-like mechanism driven by the actuator to produce periodic oscillations at a substantially fixed frequency (e.g., 1 kHz). MEMS-based scanning devices may include a mirror between about 1mm and 10mm in diameter, where the mirror is rotated using electromagnetic or electrostatic actuation. A voice coil motor (which may be referred to as a voice coil actuator) may include a magnet and a coil. When current is supplied to the coil, a translational force will be applied to the magnet, which causes the mirror attached to the magnet to move or rotate.
In an example embodiment, the scanner 162 includes a single mirror configured to scan the output beam 170 in a single direction (e.g., the scanner 162 may be a one-dimensional scanner that scans in a horizontal or vertical direction). The mirror may be a planar scanning mirror attached to a scanner actuator or mechanism that scans the mirror over a particular range of angles. The mirror may be driven by one actuator (e.g., a galvanometer) or two actuators configured to drive the mirror in a push-pull configuration. When two actuators drive the mirror in one direction in a push-pull configuration, the actuators may be located at opposite ends or sides of the mirror. The actuators may operate in a coordinated manner such that when one actuator pushes the mirror, the other actuator pulls the mirror, and vice versa. In another example embodiment, two voice coil actuators arranged in a push-pull configuration drive the mirror in either a horizontal or vertical direction.
In some implementations, the scanner 162 can include one mirror configured to be scanned along two axes, with two actuators arranged in a push-pull configuration providing motion along each axis. For example, two resonant actuators arranged in a horizontal push-pull configuration may drive the mirror in a horizontal direction, and another pair of resonant actuators arranged in a vertical push-pull configuration may drive the mirror in a vertical direction. In another example embodiment, two actuators scan the output beam 170 in two directions (e.g., horizontal and vertical), where each actuator provides rotational motion in a particular direction or about a particular axis.
The scanner 162 can also include a mirror driven by two actuators configured to scan the mirror along two substantially orthogonal directions. For example, a resonant actuator or a galvanometer actuator may drive one mirror in a substantially horizontal direction, and a galvanometer actuator may drive the mirror in a substantially vertical direction. As another example, two resonant actuators may drive the mirror in two substantially orthogonal directions.
In some embodiments, the scanner 162 includes two mirrors, one of which scans the output beam 170 in a substantially horizontal direction and the other of which scans the output beam 170 in a substantially vertical direction. In the example of FIG. 2, scanner 162 includes two mirrors, mirror 180-1 and mirror 180-2. Mirror 180-1 may scan output beam 170 in a substantially horizontal direction and mirror 180-2 may scan output beam 170 in a substantially vertical direction (or vice versa). Mirror 180-1 or mirror 180-2 may be a flat mirror, a curved mirror, or a multi-faceted mirror having two or more reflective surfaces.
The scanner 162 in other embodiments includes two galvanometer scanners that drive respective mirrors. For example, scanner 162 can include a galvanometer actuator that scans mirror 180-1 in a first direction (e.g., vertical), and scanner 162 can include another galvanometer actuator that scans mirror 180-2 in a second direction (e.g., horizontal). In yet another embodiment, the scanner 162 includes two mirrors, with a galvanometer actuator driving one mirror and a resonant actuator driving the other mirror. For example, a galvanometer actuator can scan mirror 180-1 along a first direction and a resonant actuator can scan mirror 180-2 along a second direction. The first and second scanning directions may be substantially orthogonal to each other, e.g. the first direction may be substantially vertical and the second direction may be substantially horizontal. In yet another embodiment, the scanner 162 includes two mirrors, one of which is a polygon mirror that is rotated in one direction (e.g., clockwise or counterclockwise) by a motor (e.g., a brushless DC motor). For example, mirror 180-1 may be a polygon mirror that scans output beam 170 in a substantially horizontal direction, and mirror 180-2 may scan output beam 170 in a substantially vertical direction. The polygon mirror may have two or more reflective surfaces, and the polygon mirror may continuously rotate in one direction such that the output beam 170 is reflected from each reflective surface in turn. The polygon mirror may have a cross-sectional shape corresponding to a polygon, wherein each side of the polygon has a reflective surface. For example, a polygon mirror having a square cross-sectional shape may have four reflective surfaces, while a polygon mirror having a pentagonal cross-sectional shape may have five reflective surfaces.
To direct the output beam 170 along a particular scan pattern, the scanner 162 can include two or more actuators that synchronously drive a single mirror. For example, two or more actuators may synchronously drive the mirror in two substantially orthogonal directions to cause the output beam 170 to follow a scan pattern having a substantially straight line. In some embodiments, the scanner 162 can include two mirrors and an actuator that synchronously drives the two mirrors to generate a scan pattern that includes a substantially straight line. For example, a galvanometer actuator may drive mirror 180-2 in a substantially linear back-and-forth motion (e.g., the galvanometer may be driven in a substantially sinusoidal or triangular waveform), which causes output beam 170 to track a substantially horizontal back-and-forth pattern, and another galvanometer actuator may scan mirror 180-1 in a substantially vertical direction. The two galvanometers may be synchronized such that for every 64 horizontal traces, output beam 170 forms a single trace along the vertical direction. Whether one or two mirrors are used, the substantially straight line may be directed substantially horizontally, vertically, or in any other suitable direction.
The scanner 162 may also apply dynamically adjusted deflections (e.g., with galvanometer actuators) in the vertical direction to achieve a straight line as the output beam 170 is scanned in a substantially horizontal direction (e.g., with galvanometer or resonant actuators). If no vertical deflection is applied, the output beam 170 may trace a curved path as it scans from side to side. In some embodiments, the scanner 162 uses a vertical actuator to apply a dynamically adjusted vertical deflection as the output beam 170 is scanned horizontally, and a discrete vertical offset between each horizontal scan (e.g., to step the output beam 170 to a subsequent line of the scan pattern).
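The scan pattern described above can be pictured as a raster of angular positions: a back-and-forth horizontal sweep with a discrete vertical step between lines. The following sketch is a simplified illustration under our own assumptions (uniform pixel spacing; the function and parameter names are ours) and is not the scan-control logic of the system itself:

```python
import numpy as np

def raster_scan_pattern(h_for_deg: float = 60.0, v_for_deg: float = 20.0,
                        pixels_per_line: int = 1000, lines: int = 64):
    """Return a list of (azimuth, altitude) pairs for a back-and-forth
    horizontal sweep with a discrete vertical offset between lines."""
    pattern = []
    for line in range(lines):
        altitude = v_for_deg / 2.0 - line * v_for_deg / (lines - 1)
        azimuths = np.linspace(-h_for_deg / 2.0, h_for_deg / 2.0, pixels_per_line)
        if line % 2 == 1:
            azimuths = azimuths[::-1]  # reverse direction on alternate lines
        pattern.extend((float(az), altitude) for az in azimuths)
    return pattern

# 64 lines of 1000 pixels over a 60 x 20 degree field of regard.
print(len(raster_scan_pattern()))  # 64000
```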
With continued reference to fig. 2, in this example embodiment, overlapping mirror 190 is configured to overlap input beam 172 and output beam 170 such that beams 170 and 172 are substantially coaxial. In fig. 2, overlapping mirror 190 includes a hole, slit, or aperture 192 through which output beam 170 passes, and a reflective surface 194 that reflects at least a portion of input beam 172 toward receiver 164. The overlap mirror 190 may be oriented such that the input beam 172 and the output beam 170 at least partially overlap.
In some embodiments, the overlapping mirror 190 may not include the hole 192. For example, output beam 170 may be directed through one side of mirror 190 instead of through aperture 192. The output beam 170 may pass along the edge of the mirror 190 and may be oriented at a slight angle with respect to the direction of the input beam 172. As another example, overlapping mirror 190 may include a small reflective portion configured to reflect output beam 170, and other portions of overlapping mirror 190 may have an AR coating configured to transmit input beam 172.
The input beam 172 may pass through a lens 196 that focuses the beam onto the active area 166 of the receiver 164. The active area 166 may refer to an area over which the receiver 164 may receive or detect input light. The active area may have any suitable size or diameter d, for example, a diameter of approximately 25 μm, 50 μm, 80 μm, 100 μm, 200 μm, 500 μm, 1 mm, 2 mm, or 5 mm. The overlap mirror 190 may have a reflective surface 194 that is substantially flat, or the reflective surface 194 may be curved (e.g., the mirror 190 may be an off-axis parabolic mirror configured to focus the input beam 172 onto the active area of the receiver 164).
The hole 192 may have any suitable size or diameter Φ1, and the input beam 172 may have any suitable size or diameter Φ2, where Φ2 is greater than Φ1. For example, the hole 192 may have a diameter Φ1 of approximately 0.2 mm, 0.5 mm, 1 mm, 2 mm, 3 mm, 5 mm, or 10 mm, and the input beam 172 may have a diameter Φ2 of approximately 2 mm, 5 mm, 10 mm, 15 mm, 20 mm, 30 mm, 40 mm, or 50 mm. In some embodiments, the reflective surface 194 of the overlap mirror 190 may reflect 70% or more of the input beam 172 toward the receiver 164. For example, if the reflective surface 194 has a reflectivity R at the operating wavelength of the light source 160, the fraction of the input beam 172 directed toward the receiver 164 may be expressed as R × [1 − (Φ1/Φ2)²]. As a more specific example, if R is 95%, Φ1 is 2 mm, and Φ2 is 10 mm, then approximately 91% of the input beam 172 may be directed toward the receiver 164 by the reflective surface 194.
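The throughput expression above is straightforward to verify numerically. A minimal illustrative sketch (the function name and units are ours):

```python
def fraction_toward_receiver(reflectivity: float, hole_dia_mm: float,
                             beam_dia_mm: float) -> float:
    """Fraction of the input beam reflected toward the receiver:
    R * [1 - (phi1/phi2)**2], where phi1 is the hole diameter and
    phi2 is the input-beam diameter."""
    return reflectivity * (1.0 - (hole_dia_mm / beam_dia_mm) ** 2)

# R = 95%, phi1 = 2 mm, phi2 = 10 mm gives approximately 91%.
print(fraction_toward_receiver(0.95, 2.0, 10.0))  # 0.912
```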
Fig. 3 shows a lidar system 200, the lidar system 200 including a polygon mirror 202 driven by a motor 204. Lidar system 200 operates in a binocular configuration with a first eye 206A and a second eye 206B. The first eye 206A includes a collimator 210A, a scanning mirror 212A, and a receiver 214A, while the second eye 206B includes a collimator 210B, a scanning mirror 212B, and a receiver 214B. The polygon mirror 202 may be in the form of a rotatable block having a plurality of reflective surfaces that are angularly offset from one another along a polygonal periphery of the rotatable block. In this example embodiment, the polygon mirror 202 has six reflective surfaces 220A, 220B, … 220F; however, the polygon mirror 202 may generally include any suitable number of surfaces, e.g., three, four, five, eight, etc. The motor 204 transmits the rotation to the rotatable polygon mirror 202. The scanning mirrors 212A and 212B are configured to rotate in an oscillating manner about respective axes within a certain angular range, the axes being orthogonal to the rotational axis of the polygon mirror 202.
The light source 222 may be a fiber laser including a seed laser diode. The output of the light source 222 may be provided to the collimators 210A and 210B via fiber optic cables 224A and 224B, free space coupling, or in any other suitable manner. Although lidar system 200 uses a collimator coupled to a shared light source, in other embodiments of the system, each eye may include its own direct-emitting laser diode. In this case, the light source 222 may be made of a plurality of direct-emitting laser diodes (e.g., high-power laser diodes) that directly emit pulses without optical amplification. The laser diodes may be accommodated in respective sensor heads.
In operation, the collimators 210A and 210B direct the output beams 226A and 226B onto the scan mirrors 212A and 212B, respectively. The scan mirrors 212A and 212B then reflect these beams toward non-adjacent reflective surfaces of the polygon mirror 202, which in turn directs the output beams 226A and 226B toward the respective regions of the field of regard. The input beams 228A and 228B impinge on non-adjacent reflective surfaces of the polygon mirror 202 and are reflected toward the scan mirrors 212A and 212B, respectively. The input beams 228A and 228B then propagate toward the receivers 214A and 214B. In other implementations, the input and output beams of the different eyes may impinge on adjacent surfaces of the polygon mirror.
Referring again to fig. 1, in some embodiments scanner 120 includes a polygon mirror similar to polygon mirror 202, along with one or two mirrors similar to scan mirrors 212A and 212B, depending on the number of eyes of the lidar system. The discussion below refers primarily to lidar system 100, but it should be understood that, unless explicitly stated otherwise, the techniques for adjusting scan parameters over the portion of the field of regard that overlaps the ground also can be implemented in lidar system 200.
Fig. 4 illustrates an exemplary configuration in which several components of lidar system 100, or of another suitable system, may operate to scan a 360-degree field of regard. In general, the field of view of the light source in this configuration follows a circular trajectory and accordingly defines a circular scan pattern on a two-dimensional plane. According to one embodiment, all points on the trajectory remain at the same elevation relative to the ground. In this case, separate beams may follow circular trajectories with certain vertical offsets relative to one another. In another embodiment, the points of the trajectory may define a helical scan pattern in three-dimensional space. A single beam may be sufficient to trace the helical scan pattern, but multiple beams may also be used if desired.
In the example of fig. 4, the rotational scanning module 230 rotates about a central axis in one or two directions. An electric motor may drive the rotational scanning module 230 about the central axis at a constant speed, for example. The rotational scanning module 230 includes a scanner, a receiver, an overlap mirror, etc. The components of the rotational scanning module 230 may be similar to the scanner 120, the receiver 140, and the overlap mirror 115 discussed above. In some embodiments, the rotational scanning module 230 also includes a light source and a controller. In other embodiments, the light source and/or the controller are disposed apart from the rotational scanning module 230 and/or exchange optical and electrical signals with the components of the rotational scanning module 230 via corresponding links.
The rotational scanning module 230 may include a housing 232 with a window 234. Similar to the window 157 of fig. 1, the window 234 may be made of glass, plastic, or any other suitable material. The window 234 allows outbound beams as well as return signals to pass through the housing 232. The arc length defined by the window 234 may correspond to any suitable percentage of the circumference of the housing 232. For example, the arc length may correspond to 5%, 20%, 30%, 60%, or possibly even 100% of the circumference.
Referring now to FIG. 5, the rotational scanning module 236 is substantially similar to the rotational scanning module 230. However, in this embodiment, the components of the rotary scanning module 236 are disposed on a platform 237 that rotates within a stationary circular housing 238. In this embodiment, circular housing 238 is substantially transparent to light at the operating wavelength of the lidar system to pass both the inward and outward optical signals. In a sense, the circular housing 238 defines a circular window similar to window 234 and may be made of similar materials.
One type of lidar system 100 is a pulsed lidar system, in which the light source 110 emits pulses of light and the distance to a remote target 130 is determined from the time of flight for a pulse of light to travel to the target 130 and back. Another type of lidar system 100 is a frequency-modulated lidar system, which may be referred to as a frequency-modulated continuous-wave (FMCW) lidar system. An FMCW lidar system uses frequency-modulated light to determine the distance to a remote target 130 based on the modulation frequency of the received light (scattered by the remote target) relative to the modulation frequency of the emitted light. For example, for a linearly chirped light source (e.g., a frequency modulation that produces a linear change in frequency with time), the larger the frequency difference between the emitted light and the received light, the farther away the target 130 is located. The frequency difference may be determined by mixing the received light with a portion of the emitted light (e.g., by coupling the two optical beams onto an APD, or by coupling the analog electrical signals) and measuring the resulting beat frequency. For example, the electrical signal from an APD may be analyzed using a fast Fourier transform (FFT) technique to determine the frequency difference between the emitted and received light.
If a linear frequency modulation m (e.g., in Hz/s) is applied to a CW laser, the distance D from the target 130 to the lidar system may be expressed as D = c·Δf/(2m), where c is the speed of light and Δf is the frequency difference between the emitted light and the received light. For example, for a linear frequency modulation of 10¹² Hz/s (or 1 MHz/μs), if a frequency difference of 330 kHz is measured, then the distance to the target is approximately 50 meters. Additionally, a frequency difference of 1.33 MHz corresponds to a target located approximately 200 meters away.
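The FMCW range relation may likewise be checked numerically. The sketch below (the names are ours; Doppler effects are ignored for simplicity) reproduces the two example values above:

```python
C = 299_792_458.0  # speed of light in m/s

def fmcw_distance(beat_freq_hz: float, chirp_rate_hz_per_s: float) -> float:
    """Distance from the transmit/receive frequency difference: D = c*Δf/(2m)."""
    return C * beat_freq_hz / (2.0 * chirp_rate_hz_per_s)

print(fmcw_distance(330e3, 1e12))   # ≈ 49.5 m  (about 50 meters)
print(fmcw_distance(1.33e6, 1e12))  # ≈ 199.4 m (about 200 meters)
```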
The light source 110 for an FMCW lidar system may be a fiber laser (e.g., a seed laser diode followed by one or more optical amplifiers) or a direct emitting laser diode. The seed laser diode or direct emitting laser diode may be operated in CW mode (e.g., by driving the laser diode with a substantially constant DC current), and the frequency modulation may be provided by an external modulator (e.g., an electro-optic phase modulator). Alternatively, the frequency modulation may be generated by applying a DC bias current to the seed laser diode or the direct emitting laser diode together with the current modulation. The current modulation produces a corresponding refractive index modulation in the laser diode, thereby frequency modulating the light emitted by the laser diode. The current modulation component (and corresponding frequency modulation) may have any suitable frequency or shape (e.g., sine wave, triangular wave, or sawtooth).
Generating pixels within a field of view of a lidar system
Fig. 6 illustrates an example scan pattern 240 that lidar system 100 of fig. 1 may produce. Lidar system 100 may be configured to scan output optical beam 125 along one or more scan patterns 240. In some embodiments, the scan pattern 240 corresponds to any suitable field of regard (FOR) having any suitable horizontal FOR (FOR_H) and any suitable vertical FOR (FOR_V). For example, a scan pattern may have a field of regard represented by angular dimensions (e.g., FOR_H × FOR_V) of 40°×30°, 90°×40°, or 60°×15°. As another example, a scan pattern may have a FOR_H greater than or equal to 10°, 25°, 30°, 40°, 60°, 90°, or 120°. As yet another example, a scan pattern may have a FOR_V greater than or equal to 2°, 5°, 10°, 15°, 20°, 30°, or 45°. In the example of fig. 6, reference line 246 represents the center of the field of regard of scan pattern 240. Reference line 246 may have any suitable orientation, such as a horizontal angle of 0° (e.g., reference line 246 may be oriented straight ahead) and a vertical angle of 0° (e.g., reference line 246 may have an inclination of 0°), or reference line 246 may have a non-zero horizontal angle or a non-zero inclination (e.g., a vertical angle of +10° or −10°). In fig. 6, if scan pattern 240 has a 60°×15° field of regard, then scan pattern 240 covers a ±30° horizontal range and a ±7.5° vertical range with respect to reference line 246. Additionally, optical beam 125 in fig. 6 has an orientation of approximately −15° horizontal and +3° vertical with respect to reference line 246. Beam 125 may be said to have an azimuth of −15° and an altitude of +3° relative to reference line 246. The azimuth (which may be referred to as an azimuth angle) may represent a horizontal angle with respect to reference line 246, and the altitude (which may be referred to as an altitude angle, elevation, or elevation angle) may represent a vertical angle with respect to reference line 246.
The scan pattern 240 may include multiple pixels 242, and each pixel 242 may be associated with one or more laser pulses and one or more corresponding distance measurements. A cycle of the scan pattern 240 may include a total of P_x × P_y pixels 242 (e.g., a two-dimensional distribution of P_x by P_y pixels). For example, the scan pattern 240 may include a distribution with dimensions of approximately 100-2,000 pixels 242 along the horizontal direction and approximately 4-400 pixels 242 along the vertical direction. As another example, the scan pattern 240 may include a distribution of 1,000 pixels 242 along the horizontal direction by 64 pixels 242 along the vertical direction, for a total of 64,000 pixels per cycle of the scan pattern 240 (e.g., a frame size of 1000×64 pixels). The number of pixels 242 along the horizontal direction may be referred to as the horizontal resolution or pixel density of the scan pattern 240, and the number of pixels 242 along the vertical direction may be referred to as the vertical resolution or pixel density of the scan pattern 240. As an example, the scan pattern 240 may have a horizontal resolution greater than or equal to 100 pixels 242 and a vertical resolution greater than or equal to 4 pixels 242. As another example, the scan pattern 240 may have a horizontal resolution of 100-2,000 pixels 242 and a vertical resolution of 4-400 pixels 242.
Each pixel 242 may be associated with a distance (e.g., the distance to a portion of the target 130 from which the corresponding laser pulse was scattered) or one or more angular values. As an example, the pixel 242 may be associated with a distance value and two angular values (e.g., an azimuth and an altitude) that represent the angular position of the pixel 242 with respect to lidar system 100. The distance to a portion of the target 130 may be determined based at least in part on a time-of-flight measurement of the corresponding pulse. The angular values (e.g., azimuth or altitude) may correspond to the angle (e.g., with respect to reference line 246) of the output beam 125 (e.g., when the corresponding pulse is emitted from lidar system 100) or the angle of the input beam 135 (e.g., when the input signal is received by lidar system 100). In some implementations, lidar system 100 determines the angular values based at least in part on the positions of components of scanner 120. For example, the azimuth or altitude value associated with the pixel 242 may be determined from the angular positions of one or more corresponding scan mirrors of scanner 120.
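A pixel expressed as one distance and two angles maps to a Cartesian point in the usual spherical-to-Cartesian manner. The sketch below is illustrative only and assumes one particular axis convention (x along reference line 246, y to the left, z up), which this disclosure does not prescribe:

```python
import math

def pixel_to_point(distance_m: float, azimuth_deg: float,
                   altitude_deg: float) -> tuple:
    """Convert (distance, azimuth, altitude) to (x, y, z), with x pointing
    along the reference line, y to the left, and z up (assumed convention)."""
    az = math.radians(azimuth_deg)
    alt = math.radians(altitude_deg)
    x = distance_m * math.cos(alt) * math.cos(az)
    y = distance_m * math.cos(alt) * math.sin(az)
    z = distance_m * math.sin(alt)
    return (x, y, z)

# The beam of fig. 6: azimuth -15 degrees, altitude +3 degrees, 100 m range.
print(pixel_to_point(100.0, -15.0, 3.0))
```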
In some embodiments, lidar system 100 directs multiple beams across the field of regard simultaneously. In the example embodiment of fig. 7, the lidar system generates output beams 250A, 250B, 250C, … 250N, each of which follows a linear scan pattern 254A, 254B, 254C, … 254N. The number of parallel lines may be 2, 4, 12, 20, or any other suitable number. Lidar system 100 may angularly separate the beams 250A, 250B, 250C, … 250N, so that, for example, the separation between beams 250A and 250B may be 30 cm at one distance and 50 cm at a longer distance.
Similar to scan pattern 240, each of linear scan patterns 254A-N includes pixels associated with one or more laser pulses and distance measurements. Fig. 7 shows example pixels 252A, 252B, and 252C along scan patterns 254A, 254B, and 254C, respectively. Lidar system 100 in this example may generate values for pixels 252A-252N simultaneously, thereby increasing the rate at which pixel values are determined.
Depending on the implementation, lidar system 100 may output the beams 250A-N at the same wavelength or at different wavelengths. For example, beam 250A may have a wavelength of 1540 nm, beam 250B may have a wavelength of 1550 nm, beam 250C may have a wavelength of 1560 nm, and so on. The number of different wavelengths used by lidar system 100 need not match the number of beams. Thus, lidar system 100 in the example embodiment of fig. 7 may use M wavelengths in the N beams, where 1 ≤ M ≤ N.
Next, fig. 8 illustrates an example light-source field of view (FOV_L) and receiver field of view (FOV_R) for lidar system 100. The light source 110 may emit pulses of light as the scanner 120 scans the FOV_L and the FOV_R across the field of regard (FOR). The light-source field of view may refer to an angular cone illuminated by the light source 110 at a particular instant. Similarly, the receiver field of view may refer to an angular cone over which the receiver 140 may receive or detect light at a particular instant, and any light outside the receiver field of view may not be received or detected. For example, as the scanner 120 scans the light-source field of view across the field of regard, lidar system 100 may send a pulse of light in the direction the FOV_L is pointing at the moment the light source 110 emits the pulse. The pulse of light may scatter off the target 130, and the receiver 140 may receive and detect the portion of the scattered light that is directed along, or contained within, the FOV_R.
The instantaneous FOV may refer to the angular cone illuminated by a pulse that is directed along the direction in which the light-source FOV points at the instant the pulse of light is emitted. Thus, while the light-source FOV and the detector FOV are scanned together in a synchronized manner (e.g., the scanner 120 scans the light-source FOV and the detector FOV across the field of regard along the same scan direction and at the same scan speed, maintaining the same relative position to each other), each instantaneous FOV remains "fixed" in place, and the detector FOV effectively moves relative to the instantaneous FOV. More specifically, when a pulse of light is emitted, the scanner 120 directs the pulse along the direction in which the light-source FOV currently points. Each instantaneous FOV (IFOV) corresponds to a pixel. Thus, each time a pulse is emitted, lidar system 100 produces or defines an IFOV (or pixel) that is fixed in place and corresponds to the light-source FOV at the time the pulse is emitted. During the operation of the scanner 120, the detector FOV moves relative to the light-source IFOV but not relative to the light-source FOV.
In some embodiments, the scanner 120 is configured to scan both the light-source field of view and the receiver field of view across the field of regard of lidar system 100. As the scanner 120 scans the FOV_L and the FOV_R across the field of regard, lidar system 100 may emit and detect multiple pulses of light while tracing out the scan pattern 240. In some implementations, the scanner 120 scans the light-source field of view and the receiver field of view synchronously with respect to each other. In this case, as the scanner 120 scans the FOV_L across the scan pattern 240, the FOV_R follows substantially the same path at the same scanning speed. Additionally, the FOV_L and the FOV_R may maintain the same relative position to each other as the scanner 120 scans the FOV_L and the FOV_R across the field of regard. For example, the FOV_L may substantially overlap, or be centered within, the FOV_R (as illustrated in fig. 8), and the scanner 120 may maintain this relative positioning between the FOV_L and the FOV_R throughout the scan. As another example, the FOV_R may lag behind the FOV_L by a particular, fixed amount throughout the scan (e.g., the FOV_R may be offset from the FOV_L in a direction opposite the scan direction).
The FOV_L may have an angular size or extent Θ_L that is substantially the same as, or corresponds to, the divergence of the output beam 125, and the FOV_R may have an angular size or extent Θ_R that corresponds to an angle over which the receiver 140 may receive and detect light. The receiver field of view may be any suitable size relative to the light-source field of view. For example, the receiver field of view may be smaller than, substantially the same size as, or larger than the angular extent of the light-source field of view. In some embodiments, the light-source field of view has an angular extent of less than or equal to 50 mrad, and the receiver field of view has an angular extent of less than or equal to 50 mrad. The FOV_L may have any suitable angular extent Θ_L, such as approximately 0.1 mrad, 0.2 mrad, 0.5 mrad, 1 mrad, 1.5 mrad, 2 mrad, 3 mrad, 5 mrad, 10 mrad, 20 mrad, 40 mrad, or 50 mrad. Similarly, the FOV_R may have any suitable angular extent Θ_R, such as approximately 0.1 mrad, 0.2 mrad, 0.5 mrad, 1 mrad, 1.5 mrad, 2 mrad, 3 mrad, 5 mrad, 10 mrad, 20 mrad, 40 mrad, or 50 mrad. The light-source field of view and the receiver field of view may have approximately equal angular extents. As an example, Θ_L and Θ_R may both be approximately equal to 1 mrad, 2 mrad, or 3 mrad. In some embodiments, the receiver field of view is larger than the light-source field of view, or the light-source field of view is larger than the receiver field of view. For example, Θ_L may be approximately equal to 1.5 mrad, and Θ_R may be approximately equal to 3 mrad.
A pixel 242 may represent or correspond to a light-source field of view. As the output beam 125 propagates from the light source 110, the diameter of the output beam 125 (as well as the size of the corresponding pixel 242) may increase according to the beam divergence Θ_L. As an example, if the output beam 125 has a Θ_L of 2 mrad, then at a distance of 100 m from lidar system 100, the output beam 125 may have a size or diameter of approximately 20 cm, and the corresponding pixel 242 may also have a corresponding size or diameter of approximately 20 cm. At a distance of 200 m from lidar system 100, the output beam 125 and the corresponding pixel 242 may each have a diameter of approximately 40 cm.
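The growth of the beam footprint (and pixel size) with distance follows from the small-angle approximation, i.e., diameter ≈ distance × divergence. A minimal sketch under that assumption (the function name is ours, and the initial beam diameter at the exit aperture is neglected):

```python
def footprint_diameter_m(distance_m: float, divergence_mrad: float) -> float:
    """Small-angle estimate of the beam (and pixel) diameter at a given range,
    neglecting the initial beam diameter at the exit aperture."""
    return distance_m * divergence_mrad * 1e-3

print(footprint_diameter_m(100.0, 2.0))  # 0.2 m (20 cm) at 100 m
print(footprint_diameter_m(200.0, 2.0))  # 0.4 m (40 cm) at 200 m
```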
Lidar system for operation in a vehicle
As described above, one or more lidar systems 100 may be integrated into a vehicle. In an example embodiment, multiple lidar systems 100 may be integrated into an automobile to provide a full 360 degree horizontal FOR around the automobile. As another example, 4-10 lidar systems 100 (each having a horizontal FOR of 45 degrees to 90 degrees) may be combined together to form a sensing system that provides a point cloud covering a horizontal FOR of 360 degrees. Lidar system 100 may be oriented such that adjacent FOR's have some amount of spatial or angular overlap to allow data from multiple lidar systems 100 to be combined or stitched together to form a single or continuous 360 degree point cloud. As an example, the FOR of each lidar system 100 may have an overlap of approximately 1-15 degrees with an adjacent FOR. In particular embodiments, a vehicle may refer to a mobile machine configured to transport people or goods. For example, a vehicle may include, may take the form of, or may be referred to as: an automobile, a motor vehicle, a truck, a bus, a van, a trailer, an off-road vehicle, an agricultural vehicle, a lawn mower, a construction device, a forklift, a robot, a golf cart, a recreational vehicle, a taxi, a motorcycle, a scooter, a bicycle, a skateboard, a train, a snowmobile, a watercraft (e.g., a ship or boat), an aircraft (e.g., a fixed wing aircraft, a helicopter, or a spacecraft), or a spacecraft. In particular embodiments, the vehicle may include an internal combustion engine or an electric motor that provides propulsion for the vehicle.
In some embodiments, one or more lidar systems 100 are included in a vehicle as part of an Advanced Driver Assistance System (ADAS) to assist a driver of the vehicle during driving. For example, lidar system 100 may be part of an ADAS that provides information or feedback to a driver (e.g., to alert the driver of a potential problem or hazard), or that automatically controls a portion of a vehicle (e.g., a braking system or steering system) to avoid a collision or accident. Lidar system 100 may be part of a vehicle ADAS that provides adaptive cruise control, automatic braking, automatic parking, collision avoidance, warning the driver of a dangerous or other vehicle, holding the vehicle in the correct lane, or providing a warning when an object or other vehicle is in a blind spot.
In some cases, one or more lidar systems 100 are integrated into a vehicle as part of an autonomous-vehicle driving system. In an example embodiment, lidar system 100 provides information about the surrounding environment to the driving system of the autonomous vehicle. The autonomous-vehicle driving system may include one or more computing systems that receive information about the surrounding environment from lidar system 100, analyze the received information, and provide control signals to the vehicle's driving systems (e.g., the steering wheel, throttle, brakes, or turn signals). For example, a lidar system 100 integrated into an autonomous car may provide a point cloud to the autonomous-vehicle driving system every 0.1 seconds (e.g., the point cloud has an update rate of 10 Hz, representing 10 frames per second). The autonomous-vehicle driving system may analyze the received point clouds to sense or identify targets 130 and their respective locations, distances, or speeds, and the autonomous-vehicle driving system may update the control signals based on this information. As an example, if lidar system 100 detects a vehicle ahead that is slowing down or stopping, the autonomous-vehicle driving system may send commands to release the throttle and apply the brakes.
Autonomous vehicles may be referred to as autonomous cars, unmanned cars, self-propelled cars, robotic cars, or unmanned vehicles. An autonomous vehicle may be a vehicle configured to sense its environment and navigate or drive with little or no manual input. For example, an autonomous vehicle may be configured to drive to any suitable location and control or perform all safety critical functions (e.g., driving, steering, braking, stopping) throughout the trip without expecting the driver to control the vehicle at any time. As another example, an autonomous vehicle may allow a driver to safely divert his attention away from driving tasks in certain environments (e.g., on a highway), or an autonomous vehicle may provide control of the vehicle in all but a few environments with little or no input or attention required by the driver.
An autonomous vehicle may be configured to drive with a driver present in the vehicle, or the autonomous vehicle may be configured to operate without a driver present. As an example, an autonomous vehicle may include a driver's seat with associated controls (e.g., a steering wheel, an accelerator pedal, and a brake pedal), and the vehicle may be configured to drive with no one seated in the driver's seat, or with little or no input from a person seated in the driver's seat. As another example, an autonomous vehicle may not include any driver's seat or associated driver controls, and the vehicle may perform substantially all of the driving functions (e.g., driving, steering, braking, parking, and navigating) without human input. As another example, an autonomous vehicle may be configured to operate without a driver (e.g., the vehicle may be configured to transport human passengers or cargo without a driver present in the vehicle). As another example, an autonomous vehicle may be configured to operate without any human passengers (e.g., the vehicle may be configured to transport cargo without any human passengers onboard).
In some embodiments, the light source of the lidar system is located remotely from some of the other components of the lidar system (such as the scanner and the receiver). Moreover, lidar systems implemented in vehicles may include fewer light sources than scanners and receivers.
Fig. 9 shows an exemplary configuration in which laser-sensor link 320 includes an optical link 330 and an electrical link 350 coupled between laser 300 and sensor 310. The laser 300 may be configured to emit pulses of light and may be referred to as a laser system, a laser head, or a light source. The laser 300 may include the light source 110 shown in fig. 1 and discussed above, may be part of the light source 110, may be similar to the light source 110, or may be substantially the same as the light source 110. Further, the scanner 302, the receiver 304, the controller 306, and the mirror 308 may be similar to the scanner 120, the receiver 140, the controller 150, and the mirror 115 discussed above. In the example of fig. 9, laser 300 is coupled to the remotely located sensor 310 by a laser-sensor link 320 (which may be referred to as a link). The sensor 310 may be referred to as a sensor head and may include the mirror 308, the scanner 302, the receiver 304, and the controller 306. In an example embodiment, the laser 300 includes a pulsed laser diode (e.g., a pulsed DFB laser) followed by an optical amplifier, and light from the laser 300 is conveyed by an optical fiber of suitable length within the laser-sensor link 320 to the scanner 302 in the remotely located sensor 310.
Laser-sensor link 320 may include any suitable number (e.g., 0, 1, 2, 3, 5, or 10) of optical links 330 and any suitable number (e.g., 0, 1, 2, 3, 5, or 10) of electrical links 350. In the example configuration shown in fig. 9, laser-sensor link 320 includes one optical link 330 from the laser 300 to an output collimator 340 and one electrical link 350 connecting the laser 300 to the controller 306. The optical link 330 may include optical fiber (which may be referred to as fiber-optic cable or fiber) that carries, conveys, or transmits light between the laser 300 and the sensor 310. The optical fiber may be, for example, single-mode (SM) fiber, multi-mode (MM) fiber, large-mode-area (LMA) fiber, polarization-maintaining (PM) fiber, photonic-crystal or photonic-bandgap fiber, gain fiber (e.g., rare-earth-doped fiber for use in an optical amplifier), or any suitable combination thereof. The output collimator 340 receives the optical pulses conveyed from the laser 300 by the optical link 330 and produces a free-space optical beam 312 that includes the optical pulses. The output collimator 340 directs the free-space optical beam 312 to the scanner 302 via the mirror 308.
Electrical link 350 may include electrical wire or cable (e.g., a coaxial cable or a twisted-pair cable) that carries or transmits electrical power and/or one or more electrical signals between the laser 300 and the sensor 310. For example, the laser 300 may include a power supply or power conditioner that provides electrical power to the laser 300, and additionally the power supply or power conditioner may provide power to one or more components of the sensor 310 (e.g., the scanner 302, the receiver 304, and/or the controller 306) via the one or more electrical links 350. In some embodiments, electrical link 350 may carry electrical signals that include data or information in analog or digital format. Additionally, electrical link 350 may provide an interlock signal from the sensor 310 to the laser 300. If the controller 306 detects a fault condition indicating a problem with the sensor 310 or with the overall lidar system, the controller 306 may change the voltage on the interlock line (e.g., from 5 V to 0 V), indicating that the laser 300 should shut down, stop emitting light, or reduce the power or energy of the emitted light. A fault condition may be triggered by a failure of the scanner 302, a failure of the receiver 304, or by a person or object coming within a threshold distance of the sensor 310 (e.g., within 0.1 m, 0.5 m, 1 m, 5 m, or any other suitable distance).
As described above, the lidar system may include one or more processors to determine the distance D to a target. In the embodiment shown in fig. 9, the controller 306 may be located in the laser 300 or in the sensor 310, or parts of the controller 306 may be distributed between the laser 300 and the sensor 310. In an exemplary embodiment, each sensor head 310 of the lidar system includes electronics (e.g., an electronic filter, a transimpedance amplifier, a threshold detector, or a time-to-digital converter (TDC)) configured to receive or process a signal from the receiver 304 or from an APD or SPAD of the receiver 304. Additionally, the laser 300 may include processing electronics configured to determine a time-of-flight value or a distance to the target based on the signal received from the sensor head 310 via the electrical link 350.
Next, fig. 10 shows an exemplary vehicle 354 with a lidar system 351 that includes a laser 352 with multiple sensor heads 360 coupled to the laser 352 via multiple laser-sensor links 370. In some embodiments, the laser 352 and the sensor heads 360 may be similar to the laser 300 and the sensor 310 discussed above. For example, each of the laser-sensor links 370 may include one or more optical links and/or one or more electrical links. The sensor heads 360 in fig. 10 are positioned or oriented to provide a view of the vehicle surroundings of greater than 30 degrees. More generally, a lidar system with multiple sensor heads may provide a horizontal field of regard around a vehicle of approximately 30°, 45°, 60°, 90°, 120°, 180°, 270°, or 360°. Each sensor head may be attached to, or incorporated into, a bumper, fender, grille, side panel, spoiler, roof, headlight assembly, taillight assembly, rearview-mirror assembly, hood, trunk, window, or any other suitable part of the vehicle.
In the example of fig. 10, four sensor heads 360 are positioned at or near the four corners of the vehicle (e.g., the sensor heads may be incorporated into a light assembly, a side panel, a bumper, or a fender), and the lasers 352 may be located within the vehicle (e.g., within or near the trunk). The four sensor heads 360 may each provide a 90 ° to 120 ° horizontal field of view (FOR), and the four sensor heads 360 may be oriented such that together they provide a full 360 degree field of view around the vehicle. As another example, lidar system 351 may include six sensor heads 360 disposed on or about the vehicle, where each sensor head 360 provides a horizontal FOR of 60 ° to 90 °. As another example, lidar system 351 may include eight sensor heads 360, and each sensor head 360 may provide a horizontal FOR of 45 ° to 60 °. In yet another example, lidar system 351 may include six sensor heads 360, where each sensor head 360 provides a 70 ° horizontal FOR, with an overlap between adjacent FOR of about 10 °. As another example, lidar system 351 may include two sensor heads 360 that together provide a horizontal FOR greater than or equal to 30 °.
The data from each of the sensor heads 360 may be combined or stitched together to generate a point cloud that covers a horizontal field of regard around the vehicle of greater than or equal to 30 degrees. For example, the laser 352 may include a controller or processor that receives data from each of the sensor heads 360 (e.g., via a corresponding electrical link 370) and processes the received data to construct a point cloud covering a 360-degree horizontal view around the vehicle, or to determine distances to one or more targets. The point cloud, or information from the point cloud, may be provided to a vehicle controller 372 via a corresponding electrical, optical, or radio link 370. In some implementations, the point cloud is generated by combining data from each of the multiple sensor heads 360 at a controller included within the laser 352, and is provided to the vehicle controller 372. In other implementations, each of the sensor heads 360 includes a controller or processor that constructs a point cloud for a portion of the 360-degree horizontal view around the vehicle and provides the respective point cloud to the vehicle controller 372. The vehicle controller 372 then combines or stitches together the point clouds from the respective sensor heads 360 to construct a combined point cloud covering a 360-degree horizontal view. Still further, the vehicle controller 372 in some implementations communicates with a remote server to process point cloud data.
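Stitching the per-head data amounts to transforming each head's points into a common vehicle frame and concatenating the results. The sketch below is a deliberately simplified illustration under our own assumptions (each head's mounting pose is reduced to a yaw angle, and translational offsets between heads are ignored); it is not the controller's actual algorithm:

```python
import numpy as np

def stitch_point_clouds(head_clouds):
    """head_clouds: list of (points, yaw_deg) pairs, one per sensor head,
    where points is an N x 3 array in the head's own frame and yaw_deg is
    the head's (assumed) mounting yaw. Returns one combined array."""
    stitched = []
    for points, yaw_deg in head_clouds:
        yaw = np.radians(yaw_deg)
        rot = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                        [np.sin(yaw),  np.cos(yaw), 0.0],
                        [0.0,          0.0,         1.0]])
        stitched.append(points @ rot.T)  # rotate into the vehicle frame
    return np.vstack(stitched)

# Four corner heads at 45, 135, 225, and 315 degrees, each with its own cloud.
clouds = [(np.random.rand(100, 3), yaw) for yaw in (45, 135, 225, 315)]
print(stitch_point_clouds(clouds).shape)  # (400, 3)
```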
In any case, the vehicle 354 may be an autonomous vehicle, where the vehicle controller 372 provides control signals to various components 390 within the vehicle 354 to maneuver and otherwise control operation of the vehicle 354. The components 390 are depicted in an expanded view in fig. 10 for ease of illustration only. The components 390 may include a throttle 374, brakes 376, a vehicle engine 378, a steering mechanism 380, lights 382 (e.g., brake lights, headlights, backup lights, emergency lights, etc.), a gear selector 384, and/or other suitable components that effectuate and control movement of the vehicle 354. The gear selector 384 may include the park, reverse, neutral, drive, and other gears. Each of the components 390 may include an interface via which the component receives commands from the vehicle controller 372 (such as "increase speed," "decrease speed," "turn left 5 degrees," "activate left turn signal," etc.) and, in some cases, provides feedback to the vehicle controller 372.
In some implementations, the vehicle controller 372 receives point cloud data from the laser 352 or sensor head 360 via link 370 and analyzes the received point cloud data to sense or identify the targets 130 and their respective locations, distances, speeds, shapes, sizes, target types (e.g., vehicle, human, tree, animal), and so forth. The vehicle controller 372 then provides control signals to the assembly 390 via the link 370 to control operation of the vehicle based on the analyzed information. For example, the vehicle controller 372 may identify an intersection based on the point cloud data and determine that the intersection is a suitable location for making a left turn. Thus, the vehicle controller 372 may provide control signals to the steering mechanism 380, throttle 374, and brake 376 to make the appropriate left turn. In another example, the vehicle controller 372 may identify a traffic light based on the point cloud data and determine that the vehicle 354 requires parking. As a result, the vehicle controller 372 may provide control signals to release the throttle 374 and apply the brake 376.
In addition to the components 390, the vehicle 354 may be equipped with sensors and remote-system interfaces 391, which may be communicatively coupled to the vehicle controller 372. The components 391 may include an inertial measurement unit (IMU) 392, a Geographic Information System (GIS) interface 394 for obtaining map data from a remote server via a communication network, a positioning unit 396 such as a Global Positioning System (GPS) receiver, etc. In some cases, the vehicle controller 372 provides data from the components 391 to lidar system 351.
Exemplary receiver implementation
Fig. 11 shows an example InGaAs avalanche photodiode (APD) 400. Referring back to fig. 1, the receiver 140 may include one or more APDs 400 configured to receive and detect light from an input beam such as beam 135. More generally, APD 400 may operate in any suitable receiver of input light. APD 400 may be configured to detect a portion of a pulse of light scattered by a target located downrange from the lidar system in which APD 400 operates. For example, APD 400 may receive a portion of a pulse of light scattered by target 130 shown in fig. 1 and generate a current signal corresponding to the received pulse of light.
APD 400 may include doped or undoped layers of any suitable semiconductor material, such as silicon, germanium, InGaAs, InGaAsP, or indium phosphide (InP). In addition, APD 400 may include an upper electrode 402 and a lower electrode 406 for coupling APD 400 to circuitry. For example, APD 400 may be electrically coupled to a voltage source that supplies a reverse-bias voltage V to APD 400. In addition, APD 400 may be electrically coupled to a transimpedance amplifier that receives the current generated by APD 400 and produces an output voltage signal corresponding to the received current. The upper electrode 402 or the lower electrode 406 may comprise any suitable conductive material, such as a metal (e.g., gold, copper, silver, or aluminum), a transparent conductive oxide (e.g., indium tin oxide), a carbon-nanotube material, or polysilicon. In some embodiments, the upper electrode 402 is partially transparent or has openings that allow input light 410 to pass through to the active area of APD 400. In fig. 11, the upper electrode 402 may have the shape of a ring that at least partially surrounds the active area of APD 400, where the active area refers to the area over which APD 400 may receive and detect input light 410. The active area may have any suitable size or diameter d, for example a diameter of approximately 25 μm, 50 μm, 80 μm, 100 μm, 200 μm, 500 μm, 1 mm, 2 mm, or 5 mm.
APD 400 may include any suitable combination of semiconductor layers having any suitable doping (e.g., n-doped, p-doped, or intrinsic undoped material). In the example of fig. 11, the InGaAs APD 400 includes a p-doped InP layer 420, an InP avalanche layer 422, an absorption layer 424 with n-doped InGaAs or InGaAsP, and an n-doped InP substrate layer 426. Depending on the implementation, APD 400 may include separate absorption and avalanche layers, or a single layer may act as both the absorption and avalanche region. APD 400 may be operated electrically as a PN diode or a PIN diode, and during operation APD 400 may be reverse-biased with a positive voltage V applied to the lower electrode 406 relative to the upper electrode 402. The applied reverse-bias voltage V may have any suitable value, for example approximately 5 V, 10 V, 20 V, 30 V, 50 V, 75 V, 100 V, or 200 V.
In fig. 11, photons of input light 410 may be absorbed primarily in the absorption layer 424, generating electron-hole pairs (which may be referred to as photogenerated carriers). For example, the absorption layer 424 may be configured to absorb photons corresponding to the operating wavelength of lidar system 100 (e.g., any suitable wavelength between approximately 1400 nm and approximately 1600 nm). In the avalanche layer 422, an avalanche-multiplication process occurs in which carriers (e.g., electrons or holes) generated in the absorption layer 424 collide with the semiconductor lattice of the avalanche layer 422, generating additional carriers through impact ionization. This avalanche process can repeat many times, so that one photogenerated carrier may result in the generation of multiple carriers. As an example, a single photon absorbed in the absorption layer 424 may lead to the generation of approximately 10, 50, 100, 200, 500, 1000, 10,000, or any other suitable number of carriers through the avalanche-multiplication process. The carriers generated in APD 400 may produce a current that is coupled to circuitry that may perform signal amplification, sampling, filtering, signal conditioning, analog-to-digital conversion, time-to-digital conversion, pulse detection, threshold detection, rising-edge detection, or falling-edge detection.
The number of carriers generated from a single photogenerated carrier may increase as the applied reverse bias V increases. If the applied reverse bias V is increased above a certain value, referred to as the APD breakdown voltage, a single carrier can trigger a self-sustaining avalanche process (e.g., the output of APD 400 saturates regardless of the input light level). An APD 400 operating at or above the breakdown voltage may be referred to as a single-photon avalanche diode (SPAD) and may be said to operate in a Geiger mode or a photon-counting mode. An APD 400 operating below the breakdown voltage may be referred to as a linear APD, and the output current generated by APD 400 may be sent to an amplifier circuit (e.g., a transimpedance amplifier). The receiver 140 (see fig. 1) may include an APD configured to operate as a SPAD and a quenching circuit configured to reduce the reverse-bias voltage applied to the SPAD when an avalanche event occurs. An APD 400 configured to operate as a SPAD may be coupled to an electronic quenching circuit that reduces the applied voltage V below the breakdown voltage when an avalanche-detection event occurs. Lowering the applied voltage halts the avalanche process, and the applied reverse-bias voltage may then be reset to await a subsequent avalanche event. In addition, APD 400 may be coupled to circuitry that generates an electrical output pulse or edge when an avalanche event occurs.
In some embodiments, APD 400 (or APD 400 together with a transimpedance amplifier) has a noise-equivalent power (NEP) of less than or equal to 100 photons, 50 photons, 30 photons, 20 photons, or 10 photons. For example, APD 400 may operate as a SPAD and may have a NEP of less than or equal to 20 photons. As another example, APD 400 may be coupled to a transimpedance amplifier that produces an output voltage signal with a NEP of less than or equal to 50 photons. The NEP of APD 400 is a metric that quantifies the sensitivity of APD 400 in terms of the minimum signal (or minimum number of photons) that APD 400 can detect. The NEP may correspond to an optical power (or number of photons) that produces a signal-to-noise ratio of 1, or the NEP may represent a threshold number of photons above which an optical signal may be detected. For example, if APD 400 has a NEP of 20 photons, an input beam 410 with 20 photons may be detected with a signal-to-noise ratio of approximately 1 (e.g., APD 400 may receive 20 photons from input beam 410 and generate an electrical signal representing input beam 410 with a signal-to-noise ratio of approximately 1). Similarly, an input beam 410 with 100 photons may be detected with a signal-to-noise ratio of approximately 5. In some embodiments, a lidar system 100 employing an APD 400 (or a combination of APD 400 and a transimpedance amplifier) with a NEP of less than or equal to 100 photons, 50 photons, 30 photons, 20 photons, or 10 photons provides improved detection sensitivity relative to a conventional lidar system that uses a PN or PIN photodiode. For example, the NEP of an InGaAs PIN photodiode used in a conventional lidar system may be about 10⁴ to 10⁵ photons, and the noise level in a lidar system using an InGaAs PIN photodiode may be 10³ to 10⁴ times higher than the noise level in a lidar system 100 using an InGaAs APD detector 400.
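As a rough numerical check of the NEP examples above, the signal-to-noise ratio can be approximated as the number of received photons divided by the NEP. The Python sketch below is illustrative only; the linear SNR model and the function name are assumptions, not part of the disclosure.

```python
# Hedged sketch: approximate SNR as (received photons) / NEP, following the
# rule of thumb above. The linear model is an illustrative assumption.
def approximate_snr(received_photons: float, nep_photons: float) -> float:
    return received_photons / nep_photons

print(approximate_snr(20, 20))   # ~1: a 20-photon signal at a NEP of 20 photons
print(approximate_snr(100, 20))  # ~5: a 100-photon signal at the same NEP
```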
Referring back to fig. 1, an optical filter may be located in front of the receiver 140 and configured to transmit light at one or more operating wavelengths of the light source 110 and attenuate light at surrounding wavelengths. For example, the optical filter may be a free-space spectral filter located in front of APD 400 of fig. 11. The spectral filter may transmit light at the operating wavelength of the light source 110 (e.g., between approximately 1530 nm and 1560 nm) and attenuate light outside that wavelength range. As a more specific example, light with wavelengths of approximately 400-1530 nm or 1560-2000 nm may be attenuated by any suitable amount, such as at least 5 dB, 10 dB, 20 dB, 30 dB, or 40 dB.
Next, fig. 12 shows an APD 502 coupled to an example pulse detection circuit 504. The APD 502 may be similar to APD 400 discussed above with reference to fig. 11, or it may be any other suitable detector. The pulse detection circuit 504 may operate as part of receiver 140 in the lidar system of fig. 1. Further, the pulse detection circuit 504 may operate in the receiver 164 of fig. 2, the receiver 304 of fig. 9, or any other suitable receiver. Alternatively, the pulse detection circuit 504 may be implemented in the controller 150, the controller 306, or another suitable controller. In some implementations, parts of the pulse detection circuit 504 may operate in a receiver while other parts operate in a controller. For example, components 510 and 512 may be part of receiver 140, while components 514 and 516 may be part of controller 150.
Pulse detection circuitry 504 may include circuitry that receives a signal from a detector (e.g., a current from APD 502) and performs current-to-voltage conversion, signal amplification, sampling, filtering, signal conditioning, analog-to-digital conversion, time-to-digital conversion, pulse detection, threshold detection, rising-edge detection, or falling-edge detection. Pulse detection circuit 504 may determine whether APD 502 has received an optical pulse, or may determine a time associated with receipt of an optical pulse by APD 502. In addition, the pulse detection circuit 504 may determine the duration of a received optical pulse. In an example embodiment, the pulse detection circuit 504 includes a transimpedance amplifier (TIA) 510, a gain circuit 512, a comparator 514, and a time-to-digital converter (TDC) 516.
The TIA 510 may be configured to receive a current signal from the APD 502 and generate a voltage signal corresponding to the received current signal. For example, in response to a received optical pulse, the APD 502 may generate a current pulse corresponding to the optical pulse. The TIA 510 may receive the current pulse from the APD 502 and generate a voltage pulse corresponding to the received current pulse. The TIA 510 may also act as an electronic filter. For example, the TIA 510 may be configured as a low-pass filter that removes or attenuates high-frequency electronic noise by attenuating signals above a particular frequency (e.g., above 1 MHz, 10 MHz, 20 MHz, 50 MHz, 100 MHz, 200 MHz, or any other suitable frequency).
The gain circuit 512 may be configured to amplify the voltage signal. As an example, the gain circuit 512 may include one or more voltage amplification stages that amplify voltage signals received from the TIA 510. For example, the gain circuit 512 may receive the voltage pulse from the TIA 510, and the gain circuit 512 may amplify the voltage pulse by any suitable amount, such as by a gain of approximately 3dB, 10dB, 20dB, 30dB, 40dB, or 50 dB. In addition, the gain circuit 512 may also function as an electronic filter configured to remove or attenuate electrical noise.
The comparator 514 may be configured to receive a voltage signal from the TIA 510 or the gain circuit 512 and produce an electrical-edge signal (e.g., a rising edge or a falling edge) when the received voltage signal rises above or falls below a particular threshold voltage VT. As an example, when the received voltage rises above VT, the comparator 514 may generate a rising-edge digital voltage signal (e.g., a signal that steps up from approximately 0 V to approximately 2.5 V, 3.3 V, 5 V, or any other suitable digital-high level). As another example, when the received voltage falls below VT, the comparator 514 may generate a falling-edge digital voltage signal (e.g., a signal that steps down from approximately 2.5 V, 3.3 V, 5 V, or any other suitable digital-high level to approximately 0 V). The voltage signal received by the comparator 514 may come from the TIA 510 or the gain circuit 512 and may correspond to the current signal generated by the APD 502. For example, the voltage signal received by the comparator 514 may include a voltage pulse corresponding to a current pulse generated by the APD 502 in response to receiving an optical pulse. The voltage signal received by the comparator 514 may be an analog signal, and the electrical-edge signal produced by the comparator 514 may be a digital signal.
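The comparator's thresholding behavior can be mimicked in software on a sampled waveform. The sketch below is a hypothetical digital analogue of the analog comparator described above; the sample values and the threshold are invented for illustration.

```python
# Illustrative software analogue of comparator 514: report a rising edge when
# the sampled voltage crosses above the threshold VT and a falling edge when
# it drops back below. Sample values and threshold are invented.
def detect_edges(samples, v_threshold):
    edges, above = [], False
    for i, v in enumerate(samples):
        if v > v_threshold and not above:
            edges.append((i, "rising"))
            above = True
        elif v <= v_threshold and above:
            edges.append((i, "falling"))
            above = False
    return edges

# A synthetic voltage pulse (arbitrary units) against a 0.5 V threshold.
print(detect_edges([0.0, 0.2, 0.7, 0.9, 0.6, 0.3, 0.1], 0.5))
# -> [(2, 'rising'), (5, 'falling')]
```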
A time-to-digital converter (TDC) 516 may be configured to receive the electrical-edge signal from the comparator 514 and determine a time interval between the emission of a pulse of light by the light source and the receipt of the electrical-edge signal. The output of the TDC 516 may be a numerical value corresponding to the time interval determined by the TDC 516. In some embodiments, the TDC 516 has an internal counter or clock with any suitable period, such as, for example, 5 ps, 10 ps, 15 ps, 20 ps, 30 ps, 50 ps, 100 ps, 0.5 ns, 1 ns, 2 ns, 5 ns, or 10 ns. For example, the TDC 516 may have an internal counter or clock with a 20 ps period, and the TDC 516 may determine that the time interval between emission and receipt of a pulse is equal to 25,000 clock periods, which corresponds to a time interval of approximately 0.5 microseconds. Referring back to fig. 1, the TDC 516 may send the value "25000" to a processor or to controller 150 of lidar system 100, which may include a processor configured to determine a distance from lidar system 100 to target 130 based at least in part on the time interval determined by the TDC 516. The processor may receive the value (e.g., "25000") from the TDC 516 and determine the distance from lidar system 100 to target 130 based on the received value.
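The arithmetic performed on the TDC output is simple time-of-flight conversion. The sketch below reproduces the worked example (25,000 counts at a 20 ps clock) under the standard assumption d = c·Δt/2; it is a sketch of the calculation described above, not of any particular implementation.

```python
# Convert a TDC count to a round-trip time and then to a target distance.
C = 299_792_458.0  # speed of light, m/s

def tdc_to_distance_m(counts: int, clock_period_s: float) -> float:
    round_trip_s = counts * clock_period_s  # e.g., 25,000 * 20 ps = 0.5 us
    return C * round_trip_s / 2.0           # halve: light goes out and back

print(tdc_to_distance_m(25_000, 20e-12))   # ~74.9 m for the example above
```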
Exemplary fiber laser embodiments
Fig. 13 illustrates an example light source 520 that may be used as light source 110 in the lidar system of fig. 1 or a similar lidar system. The light source 520 includes a seed laser 522 and an amplifier 524; in various embodiments, the light source 520 includes one or more seed lasers 522 or one or more amplifiers 524. The seed laser 522 may include (1) a laser diode (e.g., a DFB laser) driven by a pulse generator, (2) a wavelength-tunable laser configured to generate light at multiple wavelengths, (3) multiple laser diodes configured to generate light at multiple respective wavelengths, or (4) any other suitable laser source. The seed laser 522 may generate low-power optical pulses, and the one or more optical amplifiers 524 may be configured to amplify the low-power pulses to produce pulses of amplified light, which may be emitted as output beam 125. As an example, the amplifier 524 may receive optical seed pulses having an average power greater than or equal to 1 microwatt, and the amplified output pulses from the amplifier 524 may have an average power greater than or equal to 1 mW. As another example, the amplifier 524 may receive an optical seed pulse having a pulse energy greater than or equal to 1 pJ, and the amplified output pulse from the amplifier 524 may have a pulse energy greater than or equal to 0.1 μJ.
The amplifier 524 may be referred to as a fiber amplifier, optical amplifier, fiber-optic amplifier, optical amp, or amp. In various embodiments, all or part of the amplifier 524 may be included in the light source 110. The amplifier 524 may include any suitable number of optical amplification stages, e.g., one, two, three, four, or five stages. In one embodiment, the amplifier 524 comprises a single-pass amplifier, in which light makes one pass through the amplifier gain medium. In another embodiment, the amplifier 524 comprises a double-pass amplifier, in which light makes two passes through the amplifier gain medium. In some cases, the amplifier 524 may act as a preamplifier (e.g., an amplifier that amplifies seed pulses from a laser diode or seed laser 522), a mid-stage amplifier (e.g., an amplifier that amplifies light from another amplifier), or a booster amplifier (e.g., an amplifier that sends the free-space output beam 125 to a scanner of a lidar system). A preamplifier may refer to the first amplifier in a series of two or more amplifiers, a booster amplifier may refer to the last amplifier in the series, and a mid-stage amplifier may refer to any amplifier located between the preamplifier and the booster amplifier.
The amplifier 524 may provide any suitable amount of optical power gain, for example, approximately 5dB, 10dB, 20dB, 30dB, 40dB, 50dB, 60dB, or 70dB of gain. As one example, the amplifier 524 (which may include two or more separate amplification stages) may receive pulses having an average power of 1 μ W and produce amplified pulses having an average power of 5W, which corresponds to an optical power gain of approximately 67 dB. As another example, the amplifier 524 may include two or more amplification stages, each having a gain greater than or equal to 20dB, which corresponds to a total gain greater than or equal to 40 dB. As another example, the amplifier 524 may include three amplification stages (e.g., a preamplifier, a mid-stage amplifier, and a booster amplifier) having gains of about 30dB, 20dB, and 10dB, respectively, which corresponds to a total gain of about 60 dB.
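The decibel figures in these examples follow the standard definition G = 10·log10(Pout/Pin). A quick check of the 1 μW to 5 W case is shown below; the helper function is illustrative only.

```python
import math

# Optical power gain in dB: G = 10 * log10(Pout / Pin). Cascaded stages add
# their dB gains (e.g., 30 + 20 + 10 = 60 dB total).
def gain_db(p_in_w: float, p_out_w: float) -> float:
    return 10.0 * math.log10(p_out_w / p_in_w)

print(round(gain_db(1e-6, 5.0), 1))  # ~67.0 dB, matching the example above
```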
In light source 520, an optical fiber may convey the optical pulses amplified by the amplifier 524 to an output collimator that produces the free-space optical beam 125. An optical fiber, which may be referred to as a fiber-optic cable, fiber, optical link, fiber-optic link, or fiber link, may carry, convey, transport, or transmit light from one optical component to another. The optical fiber may comprise single-mode (SM) fiber, large-mode-area (LMA) fiber, multimode (MM) fiber, polarization-maintaining (PM) fiber, photonic-crystal or photonic-bandgap fiber, gain fiber (e.g., rare-earth-doped fiber used in an optical amplifier), multi-clad fiber (e.g., a double-clad fiber having a core, an inner cladding, and an outer cladding), any other suitable fiber, or any suitable combination thereof. For example, the optical fiber may comprise a glass SM fiber with a core diameter of approximately 8 μm and a cladding diameter of approximately 125 μm. As another example, the optical fiber may comprise a photonic-crystal fiber or a photonic-bandgap fiber, in which the light is confined or guided by an arrangement of holes distributed along the length of the glass fiber. In particular embodiments, one end of the optical fiber may be coupled to, attached to, or terminated at an output collimator. The output collimator may include a lens, a GRIN lens, or a fiber collimator that receives light from the fiber-optic cable and produces the free-space optical beam 125.
The amplifier 524 may be implemented as shown in fig. 14. In this example embodiment, amplifier 524 includes a pump laser 530, a pump 532, and a gain fiber 534. In operation, pump laser 530 pumps, or powers, gain fiber 534.
Optically pumped gain fiber 534 provides optical gain for light of a particular wavelength traveling through it. Both the pump light and the light being amplified may propagate substantially through the core of gain fiber 534. The gain fiber 534, which may be referred to as an optical gain fiber, may be a fiber doped with rare-earth ions, such as erbium (Er³⁺), neodymium (Nd³⁺), ytterbium (Yb³⁺), praseodymium (Pr³⁺), holmium (Ho³⁺), thulium (Tm³⁺), any other suitable rare-earth element, or any suitable combination thereof. The rare-earth dopant (which may be referred to as a gain material) absorbs light from the pump laser 530 and is "pumped," or promoted, into an excited state that provides amplification of light of a particular wavelength through stimulated emission. The rare-earth ions in the excited state may also emit photons by spontaneous emission, causing the amplifier 524 to produce amplified spontaneous emission (ASE) light. In an example embodiment, an amplifier 524 with an erbium-doped gain fiber 534 may be referred to as an erbium-doped fiber amplifier (EDFA) and may be used to amplify light having wavelengths between approximately 1520 nm and approximately 1600 nm. In some embodiments, the gain fiber 534 is doped with a combination of erbium and ytterbium dopants and may be referred to as an Er:Yb co-doped fiber, Er:Yb:glass fiber, Er:Yb fiber, Er:Yb-doped fiber, erbium/ytterbium-doped fiber, or Er/Yb gain fiber. An amplifier 524 with an Er:Yb co-doped gain fiber may be referred to as an erbium/ytterbium-doped fiber amplifier (EYDFA) and may be used to amplify light having wavelengths between approximately 1520 nm and approximately 1620 nm. A ytterbium-doped gain fiber 534 may be part of a ytterbium-doped fiber amplifier (YDFA), which may be used to amplify light having wavelengths between approximately 1000 nm and approximately 1130 nm. A thulium-doped gain fiber 534 may be part of a thulium-doped fiber amplifier (TDFA), which may be used to amplify light having wavelengths between approximately 1900 nm and approximately 2100 nm.
Adjusting scanning parameters based on ground detection
Fig. 15 illustrates an example scene within a field of view 600 of a lidar system operating in a particular vehicle. The field of view 600 has a certain horizontal angular span and a certain vertical angular span. The field of view 600 overlaps an area of the ground ahead of the vehicle, and the corresponding portion of the field of view (which may be referred to as the "ground portion") is schematically represented as the area enclosed by polygon 602. The scene includes a road with road markings 604, a vehicle 606 traveling approximately ten meters ahead in an adjacent lane to the right, a relatively distant area 608 covered by the ground portion of the field of view (which the lidar system illuminates at a relatively low grazing angle), other vehicles, and other objects such as trees (including objects beyond the maximum range of the lidar system). Due to the vehicle 606, the ground portion 602 has a longer vertical angular span at the left and middle of the field of view 600 than at the right, as shown in fig. 15.
The term "ground" may here refer to the road on which the vehicle is travelling, in particular the part of the road directly in front of the vehicle, but also to roads located to the side and behind the vehicle and any centre line, sidewalk, road shoulder, crosswalk or bike lane located on or near the road. This may include the lane in which the vehicle is located and any adjacent lanes and shoulders. Furthermore, walls and even ceilings may have markings similar to road markings when the vehicle is driving through a tunnel. The lidar system may treat the walls of the tunnel and, in some embodiments or scenarios, the ceiling of the tunnel as part of the ground that is viewable. Thus, in one example scenario, the ground portion has a relatively long vertical angular span on the left and right sides of the viewable area to include walls (at least up to a height that can be dynamically determined or fixed at 2m, 2.5m, 3m, or any other suitable value), and a shorter vertical angular span in the middle of the viewable area.
Fig. 16 is a flow diagram of an example method 620 for adjusting one or more scan parameters when the field of view includes a ground portion; the method may be implemented in a lidar system. For example, method 620 may be implemented in controller 150 (fig. 1), controller 306 (fig. 9), vehicle controller 372 (fig. 10), and/or the like. In any case, the method 620 can be implemented as a set of instructions stored on a non-transitory computer-readable medium and executable by one or more processors. For convenience, the following discussion refers primarily to the lidar system as performing steps such as detecting the distance to a point on the ground, selecting a pulse power, selecting a scan rate, and so on; it should be understood, however, that in some embodiments these decisions are implemented in the controller of an autonomous vehicle responsible for other automated decisions associated with maneuvering and otherwise controlling the vehicle (e.g., vehicle controller 372 of fig. 10).
The method 620 begins at block 622, where a ground portion of the field of view is identified. To this end, the lidar system may use data collected during one or several previous scans of the field of view. In some cases, the lidar system may generate a point cloud, and a controller of the lidar system and/or a controller of the autonomous vehicle in which the lidar system operates may apply a classifier to identify the ground portion of the field of view. In other cases, the lidar system may analyze the shapes of the return pulses and determine that the pixels generated within a portion of the field of view likely correspond to locations on the road surface or elsewhere on the ground.
Alternatively, the lidar system may determine the cumulative amount of energy in a set of pulses directed toward a region of interest by multiplying the amount of energy associated with a single pulse (which may be stored in memory as part of the configuration data, for example) by the number of pulses in the set. The lidar system may then determine the cumulative amount of returned energy corresponding to these pulses, in some cases eliminating statistical outliers as described below, and determine how much of the emitted light the region scattered back. In this way, the lidar system obtains a measure of the approximate reflectivity of the region and can determine, based on this measure, whether the region is likely to be ground. In some cases, the lidar system may eliminate as statistical outliers ranging events that produce no return pulse and ranging events that produce high-energy returns, when these events occur alone or in small sets within a larger set of ranging events that produce low-energy returns. For example, the lidar system may consider a relatively large set of pulses that define a contiguous region of interest within the field of view (e.g., 30% of the total number of pixels in a single scan frame covering the entire field of view), determine that the contiguous region of interest produces a low amount of scattered light and thus has a low average reflectivity, and eliminate as outliers a single return or a small set of high-energy returns that may correspond to, for example, a puddle on the road. Similarly, the lidar system may eliminate outliers corresponding to ranging events that lack a return, in order to distinguish returns that correspond to small amounts of scattered light from ranging events whose pulses may have traveled toward objects beyond the maximum range of the lidar system.
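One way to read the procedure above is as a ratio of returned energy to emitted energy over a region, with outliers removed first. The Python sketch below is a loose, hypothetical rendering of that idea; the outlier rule (drop missing returns and isolated high-energy returns) and all numeric values are assumptions for illustration.

```python
# Hedged sketch of the region-reflectivity estimate described above.
# The outlier handling (dropping missing returns and rare high-energy
# returns, e.g., a puddle) and the 'high' threshold are assumptions.
def estimate_reflectivity(per_pulse_energy_j, returns_j, high_threshold_j):
    emitted = per_pulse_energy_j * len(returns_j)
    kept = [r for r in returns_j
            if r is not None             # drop ranging events with no return
            and r < high_threshold_j]    # drop isolated high-energy returns
    return sum(kept) / emitted           # fraction of emitted light returned

# Mostly weak returns, with one bright outlier and one missing return.
returns = [2e-12, 3e-12, None, 2.5e-12, 9e-10, 2e-12]
print(estimate_reflectivity(1e-6, returns, high_threshold_j=1e-10))
```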
In addition to determining which portion of the field of view overlaps the ground, the lidar system can determine the distances to points on the ground along the instantaneous field of view (IFOV) path and apply those distances when adjusting one or more scan parameters. For example, the lidar system may determine that the lowest scan line in the field of view corresponds to a point on the ground 20 meters away, and adjust the pulse repetition frequency for this portion of the field of view by treating 20 meters as the effective maximum range for this portion.
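The benefit of treating the ground distance as the effective maximum range can be quantified with a simple bound: the next pulse need only wait for light to complete the round trip to that distance. The sketch below applies this bound under the assumption that exactly one pulse is in flight at a time; the function name and the example ranges are illustrative.

```python
# Maximum pulse repetition frequency if the effective range is d_max:
# the next pulse may be fired once light could have made the round trip.
C = 299_792_458.0  # speed of light, m/s

def max_prf_hz(effective_range_m: float) -> float:
    round_trip_s = 2.0 * effective_range_m / C
    return 1.0 / round_trip_s

print(f"{max_prf_hz(200.0):,.0f} Hz")  # ~750 kHz for a 200 m maximum range
print(f"{max_prf_hz(20.0):,.0f} Hz")   # ~7.5 MHz when the ground is 20 m away
```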
In addition, the lidar system may use data from other sensors to identify the ground portion of the field of view. The controller of the lidar system may receive data from, for example, an external camera (e.g., a camera included in the sensors 158 in fig. 1), or a vehicle controller (e.g., vehicle controller 372 of fig. 10) may provide an indication of the location of the ground to the controller of the lidar system.
Further, a vehicle equipped with additional sensors (e.g., the sensor and remote system interface 391 of fig. 10) may use the positioning data as well as terrain data received from a remote GIS system or stored locally to determine a distance to a point on the ground.
For clarity, referring to fig. 17, the vehicle 354 of fig. 10 may travel along a downward slope 650, in which case the ground portion of the field of view occupies a relatively small percentage of the total field of view along the vertical dimension. On the other hand, as shown in fig. 18, when the vehicle 354 travels along an upward slope 660, the ground portion occupies a relatively large percentage of the total field of view along the vertical dimension. For the vehicle 354 traveling along the downward slope 650, the distances to the nearest and farthest points on the ground within the field of view are d1 and d2; for the vehicle 354 traveling along the upward slope 660, these distances are d1' and d2'. As fig. 17 and fig. 18 schematically illustrate, the distances vary with the road topography. Furthermore, lidar systems in different vehicles may be mounted at different heights relative to the ground, which further affects the distances to points on the ground.
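For intuition about these distances, consider a sensor mounted at height h above a locally flat road: a ground point seen at depression angle θ below the horizontal lies at range r = h/sin(θ). The sketch below applies this flat-road approximation; the mount height and angles are illustrative values, and actual slopes as in figs. 17 and 18 would modify the geometry.

```python
import math

# Range to a ground point on a flat road: r = h / sin(theta), where h is the
# sensor mount height and theta the depression angle below horizontal.
# Mount height and angles are illustrative values, not from the patent.
def range_to_ground_m(mount_height_m: float, depression_deg: float) -> float:
    return mount_height_m / math.sin(math.radians(depression_deg))

for angle in (1.0, 3.0, 10.0):
    print(f"{angle:>4.1f} deg -> {range_to_ground_m(1.5, angle):6.1f} m")
# Smaller depression angles (nearer the horizon) map to much longer ranges.
```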
Depending on the implementation, the lidar system may identify the ground portion of the field of view using a single delimiter, e.g., the vertical angle at which the "horizon" appears within the field of view, i.e., the position at which the ground ends and, absent obstacles, the area above the ground begins. If desired, however, the lidar system may be configured to identify more complex boundaries of the ground portion, such as polygon 602 in fig. 15. In this case, the lidar system may need to identify the boundaries of the ground portion in both the vertical and horizontal dimensions, and may accordingly vary the scan parameters as a function of both vertical and horizontal position within the scan.
Referring again to fig. 16, flow next proceeds to block 624, where one or more scan parameters are adjusted for scanning the ground portion of the field of view. As described above, depending on whether the ground portion 602 of the field of view 600 or another portion of the field of view 600 is being scanned, the lidar system may adjust scan parameters such as the scan rate (which may be measured in radians per second, degrees per second, etc.), the line density, the pulse energy, the pulse repetition frequency, and so on. These adjustments result in a modified scan pattern, a modified distribution of pulse energy over the field of view, or both. The lidar system may thus vary scan parameters related to the light source, the scanner, or both. In various embodiments or scenarios, the lidar system may increase the density of scan lines, increase the horizontal resolution (e.g., by decreasing the scan rate and/or increasing the pulse repetition frequency, or by emitting a new pulse in response to detecting a return rather than at a predetermined fixed time interval), or increase both the scan-line density and the horizontal resolution. As discussed herein, the lidar system may vary the scan rate and line density by controlling the speed at which one or more mirrors of the scanner pivot, rotate, or otherwise move.
In some embodiments, the light source is a direct-emitting laser diode that directly produces the emitted pulses of light scanned over the field of view (e.g., the optical pulses from the laser diode are not amplified by an optical amplifier). Typically, when the light source comprises a direct-emitting laser diode, the lidar system can vary the pulse energy and the pulse repetition frequency independently of each other, so long as the maximum power-handling capability of the laser diode is not exceeded (e.g., to ensure that the laser diode does not experience thermal failure or optical damage to its output facet).
Alternatively, the light source may be implemented as light source 520 of fig. 13 and include a pulsed laser diode followed by one or more optical amplifiers. In this case, the lidar system varies the pulse energy approximately inversely with the pulse repetition frequency: as the pulse repetition frequency increases, the pulse energy decreases. As a more specific example, the lidar system may double the pulse repetition frequency and correspondingly reduce the pulse energy by approximately one half. If higher pulse energy is desired, the lidar system may need to operate the fiber laser at a reduced pulse repetition frequency.
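The inverse relationship can be expressed as E ≈ P_avg/PRF for a source whose average output power is approximately fixed. Below is a minimal check of the "double the PRF, halve the energy" statement; the 4 W average power is an assumed illustrative value.

```python
# For a fiber-amplifier source at (approximately) fixed average power,
# pulse energy scales inversely with pulse repetition frequency.
def pulse_energy_j(avg_power_w: float, prf_hz: float) -> float:
    return avg_power_w / prf_hz

print(pulse_energy_j(4.0, 500e3))   # 8 uJ at 500 kHz
print(pulse_energy_j(4.0, 1000e3))  # 4 uJ at 1 MHz: doubled PRF, half energy
```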
In one embodiment, the lidar system addresses this limitation by dynamically varying the pump laser power (i.e., varying the power of the pump laser diode that provides optical pumping to the optical gain fiber in the fiber amplifier). For example, referring to fig. 14, the lidar system may vary the power of the pump laser 530. Thus, to achieve higher pulse energy without reducing pulse repetition frequency, the lidar system may increase the pump laser power supplied to gain fiber 534. Similarly, to increase the pulse repetition frequency without reducing the pulse energy, the lidar system may increase the pump laser power supplied to gain fiber 534, while also increasing the pulse repetition frequency of seed laser diode 522 of fig. 13.
Because the lidar system may determine the distance from the lidar system to the ground along the IFOV of the emitted light pulse, the lidar system may also adjust the output power accordingly: a light pulse expected to strike the ground after traveling 20 meters, for example, does not require the same energy as a light pulse expected to strike an object 200 meters away. Thus, the lidar system may reduce the output energy of a light pulse when it expects the pulse to strike the ground at a distance less than the maximum range of the lidar system, or less than the maximum range at which the lidar system is configured to detect targets.
However, in some cases, the lidar system may increase the laser power and the output energy of a light pulse when the lidar system determines with high confidence that the emitted light pulse should strike the ground but the default pulse energy is insufficient for the lidar sensor to detect a return. For example, when a light pulse strikes a road at a sufficiently low grazing angle, the light pulse is largely reflected off the road, and little light is scattered back toward the lidar sensor, making the road difficult to detect. A relatively high optical absorption of the ground may be another or an additional reason for the low amount of scattered light returning to the lidar system when scanning the ground portion of the field of view. For example, the road may be made of a dark material that tends to absorb incident light, so that a relatively small portion of a light pulse (e.g., <30%) is scattered. Increasing the energy of the light pulse increases the likelihood that some of the scattered light will be detected by the receiver of the lidar system. In some cases, the energy of the light pulse in such a scenario may be higher than the energy required to detect an object at the default maximum range of the lidar system. In some embodiments, the lidar system is configured to estimate the angle at which an outbound pulse will be incident on the ground and to increase the pulse energy when that angle is below a certain threshold.
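A possible pulse-energy policy combining the two cases above, scaling energy down when the expected ground intercept is near but boosting it when the grazing angle falls below a threshold, is sketched below. The thresholds, energies, and linear range scaling are all invented for illustration; the text above states only the qualitative rules.

```python
# Hedged sketch of a pulse-energy policy for the ground portion. All
# thresholds and energies are invented, not taken from the patent.
def select_pulse_energy_uj(expected_range_m, grazing_angle_deg,
                           max_range_m=200.0, default_uj=1.0):
    if grazing_angle_deg < 2.0:          # very shallow incidence: boost energy
        return 2.0 * default_uj          # so some scattered light is detected
    if expected_range_m < max_range_m:   # ground closer than max range:
        return default_uj * expected_range_m / max_range_m  # scale energy down
    return default_uj

print(select_pulse_energy_uj(20.0, 5.0))   # near ground, moderate angle: low
print(select_pulse_energy_uj(150.0, 1.0))  # shallow grazing angle: boosted
```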
In order to change the scanning parameters associated with a scanner, such as the scanner shown in fig. 2 or 3, the lidar system may modify the speed at which one or more mirrors pivot. For example, to reduce the scan rate in the horizontal dimension and thereby increase horizontal resolution, a lidar system implementing the scanner 162 of fig. 2 may reduce the speed at which the mirror 180-1 (responsible for scanning the pulses of light in the horizontal dimension) pivots about the corresponding axis. More specifically, depending on the implementation, the controller may provide corresponding signals to the galvanometer scanner or the motor, for example. On the other hand, to increase the density of scan lines, the controller may decrease the speed at which mirror 180-2 (responsible for scanning the pulses of light in the vertical dimension) pivots about the corresponding axis.
When the lidar system implements a scanner using the polygon mirror 202 of fig. 3, the controller of the lidar system may reduce the speed at which the polygon mirror 202 rotates to improve horizontal resolution, and may decrease the speed at which the scan mirrors 212A and 212B rotate about their respective axes to increase the line density.
In some embodiments, the lidar system may scan the ground portion of the field of view according to one scan-line density (corresponding to a pixel density along the vertical dimension) and the remainder of the field of view according to another scan-line density. As shown in fig. 19, for example, the scan 700 may include scan lines 702A, 702B, 702C, etc., spaced at a certain angular offset in the portion of the field of view that does not overlap the ground, and scan lines 704A, 704B, 704C, etc., spaced at a smaller angular offset in the ground portion of the field of view. The lidar system may similarly use two values of horizontal resolution. Fig. 20 illustrates a scan 710 that includes scan lines 712A, 712B, 712C, etc., having a certain horizontal resolution in the portion of the field of view that does not overlap the ground, and scan lines 714A, 714B, 714C, etc., having a different horizontal resolution in the ground portion of the field of view. More specifically, in this example scenario, the pixel density along the horizontal direction in scan lines 714 is higher than the pixel density along the horizontal direction in scan lines 712.
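The two-density layout of fig. 19 can be pictured as a list of vertical scan-line angles with coarse spacing above the ground portion and finer spacing within it. The angular bounds and spacings in the sketch below are illustrative only; the patent does not specify numeric values.

```python
# Sketch of a two-density scan-line layout (cf. fig. 19): coarse line spacing
# above the ground portion of the field of view, finer spacing within it.
# All angles and step sizes are illustrative assumptions.
def scan_line_angles(top_deg, horizon_deg, bottom_deg,
                     coarse_step_deg, fine_step_deg):
    angles, a = [], top_deg
    while a > bottom_deg:
        angles.append(round(a, 2))
        a -= coarse_step_deg if a > horizon_deg else fine_step_deg
    return angles

# 10 deg down to -10 deg, with the ground portion starting at -2 deg.
print(scan_line_angles(10.0, -2.0, -10.0, coarse_step_deg=1.0,
                       fine_step_deg=0.25))
```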
Unlike scans 700 and 710 of figs. 19 and 20, fig. 21 shows how the lidar system may vary scan parameters such as scan-line density, horizontal resolution, pulse repetition frequency, pulse energy, and the like using more than two values across the ground portion and the remainder of the field of view. When scanning a field of view that includes a ground portion bounded by a vertical angle 720, the lidar system may vary the horizontal resolution according to plot 722, the pulse repetition frequency according to plot 724, and the pulse energy according to plot 726. Plots 722, 724, and 726 may correspond to preconfigured values or to dynamic values set by the lidar system or the vehicle controller based on previous scans. For example, the controller of the lidar system may implement a dynamic algorithm or machine learning to select appropriate scan parameters for the ground portion, or an appropriate combination of scan parameters (e.g., a combination of horizontal resolution and average laser power over time). As another example, the lidar system may determine an expected distance from the lidar system to a point on the ground along the IFOV path of the light source and select a certain horizontal resolution based on the expected distance.
Thus, in some cases, the lidar system combines several of the scan-parameter adjustments discussed above. For example, the lidar system may use the following options or combinations: an increased pulse repetition frequency with reduced laser power; increased laser power for low grazing angles with a reduced, default, or increased pulse repetition frequency; and the default laser power when neither the ground nor the road is expected to be in the IFOV of the light source. The lidar system may dynamically modify the laser power and pulse repetition frequency based on expected ground exposure, to ensure that the average laser power during a certain time period remains within the capability of the laser of the lidar sensor.
Additionally, the lidar system may maintain the average power of the laser at or below a particular threshold power to ensure that the lidar system meets eye-safety requirements. For example, as it scans across the field of view, the lidar system may increase the pulse energy when scanning the ground at grazing angles, and decrease the pulse energy when scanning portions of the field of view that include objects relatively close to the lidar system (e.g., <30 meters) or objects that produce relatively large amounts of scattered light. Although the lidar system may change the pulse energy as it scans across the field of view, the overall average optical power emitted during the scan may be kept below a threshold average optical power to ensure that the lidar system operates in an eye-safe manner. As another example, the lidar system may vary the pulse repetition frequency as it scans across the field of view (e.g., using a higher pulse repetition frequency when scanning areas requiring higher resolution and a lower pulse repetition frequency when scanning other areas), so that, overall, the average optical power for a scan across the field of view remains below a certain threshold power.
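The eye-safety constraint amounts to keeping the time-averaged emitted power, roughly pulse energy times repetition frequency averaged over the scan, below a threshold. Below is a simple budget check under those assumptions; the schedule and all numbers are invented for illustration.

```python
# Check a scan schedule against an average-power budget. Average power is
# approximated as sum(E_i * PRF_i * dwell_i) / total dwell time. The
# schedule values below are illustrative assumptions.
def average_power_w(segments):
    # segments: (pulse_energy_J, prf_Hz, dwell_s) per field-of-view region
    total_time = sum(d for _, _, d in segments)
    total_energy = sum(e * f * d for e, f, d in segments)
    return total_energy / total_time

scan = [(2e-6, 500e3, 0.04),   # ground portion: higher energy, some dwell
        (1e-6, 750e3, 0.06)]   # remainder of the field of view
print(average_power_w(scan))   # compare against an eye-safety threshold
```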
In some embodiments, the lidar system may adjust the pulse repetition frequency and/or the pulse energy based on the amount of scattered light produced by objects in the field of view. For example, the lidar system may increase the pulse energy when scanning an area of the field of view containing objects that produce a relatively small amount of scattered light, and reduce the pulse energy when scanning objects that produce a relatively large amount of scattered light. Referring back to fig. 15, for example, due to distance, grazing angle, and in some cases the type of road surface, the area 608 may be expected to produce little scattered light, while the vehicle 606 may have a retroreflector or simply a highly reflective surface that produces a large amount of scattered light, and the road markings 604 may produce a large amount of scattered light due to the paint used to make the markings. In general, objects that produce a small amount of scattered light (also referred to as low-scattering objects) may include objects beyond a certain distance (e.g., >150 meters), dark or absorbing objects (e.g., black tires or the dark surface of a road), or objects with relatively high specular reflection (e.g., mirrors). Objects that produce a large amount of scattered light (also referred to as highly scattering objects) may include nearby objects (e.g., objects within 20 meters) or bright objects that produce diffusely scattered light (e.g., a white shirt or white road markings).
Referring again to fig. 16, after one or more scan parameters are adjusted at block 624, the flow of method 620 proceeds to block 626, where the lidar system scans the ground portion of the field of view according to the adjusted scan parameters. As a result, the scan may yield a higher-resolution scan of that portion of the field of view, a more accurate scan due to an improved distribution of laser power, or both.
To further clarify, fig. 22 shows an example timing sequence of the outbound pulses of lidar system 100 when lidar system 100 emits the next outbound pulse upon detecting a return, rather than upon expiration of a period of fixed, predetermined duration. The pulse timing diagram 800 schematically illustrates when the controller 150 provides a signal to the light source 110 to trigger the emission of a light pulse. As shown in pulse timing diagram 800, the period between light pulses varies based on when the receiver 140 detects the return pulse corresponding to the previous light pulse.
In the example shown, after lidar system 100 emits pulse N, the receiver 140 detects a return pulse corresponding to pulse N after a time interval T1. In response to determining that the receiver 140 has received the return for pulse N, the controller 150 generates signal 810, which causes lidar system 100 to emit pulse N+1. For clarity, fig. 22 also shows a short delay between the time the return for pulse N arrives and the time pulse N+1 exits lidar system 100; this delay corresponds to the time it takes signal 810 to propagate through lidar system 100.
In the scenario of fig. 22, lidar system 100 emits pulse N+1 but does not receive a return pulse corresponding to pulse N+1 within the time T2 required for a light pulse to travel to a target located at the maximum range and return to lidar system 100. In this case, lidar system 100 generates signal 812 and emits the next pulse, N+2, upon expiration of the time period of duration T2. As further shown in fig. 22, after a time interval T3, the receiver 140 receives a return pulse corresponding to the emitted pulse N+2. Because T1 < T2 and T3 < T2 in this case, the lidar system achieves a higher pulse rate than a fixed-rate scheme in which each pair of adjacent pulses is separated by a time interval of duration T2.
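The timing rule of fig. 22 (emit the next pulse as soon as a return is detected, or after the maximum-range timeout T2 otherwise) can be sketched as a small simulation. The return delays below are invented, and the short signal-propagation delay noted above is ignored for simplicity.

```python
# Sketch of the adaptive pulse timing in fig. 22: the next pulse is emitted
# when the return is detected, or when the max-range timeout T2 expires.
# Return delays (None = no return) are invented for illustration.
def emission_times(return_delays_s, t2_s):
    t, times = 0.0, []
    for delay in return_delays_s:
        times.append(t)
        wait = delay if (delay is not None and delay < t2_s) else t2_s
        t += wait
    return times

T2 = 1.33e-6  # ~200 m max-range round trip
print(emission_times([0.4e-6, None, 0.9e-6], T2))
# [0.0, 4e-07, 1.73e-06]: pulses N, N+1, N+2; N+1 waited the full T2.
```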
General considerations
In some cases, a computing device may be used to implement the various modules, circuits, systems, methods, or algorithm steps disclosed herein. As an example, all or a portion of the modules, circuits, systems, methods, or algorithms disclosed herein may be implemented or performed by: a general purpose single-or multi-chip processor, a Digital Signal Processor (DSP), an ASIC, an FPGA, any other suitable programmable logic device, discrete gate or transistor logic, discrete hardware components, or any suitable combination thereof. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
In particular embodiments, one or more implementations of the subject matter described herein may be implemented as one or more computer programs (e.g., one or more modules of computer program instructions encoded or stored on a computer-readable non-transitory storage medium). As an example, the steps of a method or algorithm disclosed herein may be embodied in a processor-executable software module, which may reside on a computer-readable non-transitory storage medium. In particular embodiments, the computer-readable non-transitory storage medium may include any suitable storage medium that can be used to store or transfer computer software and that can be accessed by a computer system. Herein, a computer-readable non-transitory storage medium or media may include one or more semiconductor-based or other integrated circuits (ICs) (e.g., field-programmable gate arrays (FPGAs) or application-specific ICs (ASICs)), hard disk drives (HDDs), hybrid hard drives (HHDs), optical discs (e.g., compact discs (CDs), CD-ROMs, digital versatile discs (DVDs), Blu-ray or laser discs), optical disc drives (ODDs), magneto-optical discs, magneto-optical drives, floppy disks, floppy disk drives (FDDs), magnetic tape, flash memory, solid-state drives (SSDs), RAM, RAM drives, ROM, secure digital cards or drives, any other suitable computer-readable non-transitory storage media, or any suitable combination of two or more of these, where appropriate. A computer-readable non-transitory storage medium may be volatile, non-volatile, or a combination of volatile and non-volatile, where appropriate.
In some instances, certain features that are described herein in the context of separate embodiments may also be combined and implemented in a single embodiment. Conversely, various features that are described in the context of a single embodiment may also be implemented in multiple embodiments separately or in any suitable subcombination. Furthermore, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or a variation of a subcombination.
Although operations may be depicted in the drawings as occurring in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all depicted operations be performed. Further, the drawings may schematically depict one or more example processes or methods in the form of a flow chart or sequence diagram; however, other operations that are not depicted may be included in the example processes or methods that are schematically illustrated. For example, one or more additional operations may be performed before, after, concurrently with, or between any of the illustrated operations. Moreover, one or more operations depicted in the drawings may be repeated, where appropriate, and the operations depicted in the drawings may be performed in any suitable order. Furthermore, although particular components, devices, or systems are described herein as performing particular operations, any suitable operations or combinations of operations may be performed using any suitable combination of any suitable components, devices, or systems. In some cases, multitasking or parallel-processing operations may be performed. Moreover, the separation of various system components in the embodiments described herein should not be understood as requiring such separation in all embodiments; it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
Various embodiments have been described in connection with the accompanying drawings. It should be understood, however, that the drawings are not necessarily drawn to scale. By way of example, the distances or angles depicted in the figures are schematic and do not necessarily have an exact relationship to the actual dimensions or layout of the apparatus shown.
The scope of the present disclosure includes all changes, substitutions, variations, alterations, and modifications to the example embodiments described or illustrated herein that a person having ordinary skill in the art would comprehend. The scope of the present disclosure is not limited to the example embodiments described or illustrated herein. Moreover, although the present disclosure describes or illustrates various embodiments herein as including particular components, elements, functions, operations, or steps, any of these embodiments may include any combination or permutation of any of the components, elements, functions, operations, or steps described or illustrated anywhere herein that a person having ordinary skill in the art would understand.
The term "or" as used herein is to be interpreted as inclusive or meaning any one or any combination unless explicitly indicated otherwise or otherwise indicated by context. Thus, herein, the expression "a or B" means "A, B or both a and B". As another example, "A, B or C" herein means at least one of: a; b; c; a and B; a and C; b and C; A. b and C. Exceptions to this definition may occur if a combination of elements, devices, steps or operations are in some way inherently mutually exclusive.
As used herein, approximating language, such as (but not limited to) "about," "substantially," or "approximately," refers to a condition that, when so modified, is understood not to be necessarily absolute or perfect, but would be considered close enough by those of ordinary skill in the art to warrant designating the condition as present. The extent to which the description may vary depends on how large a change can be made while still allowing one of ordinary skill in the art to recognize the modified feature as having the required characteristics or capabilities of the unmodified feature. In general, but subject to the preceding discussion, a numerical value modified herein by an approximating word (e.g., "about") may vary from the stated value by ±0.5%, ±1%, ±2%, ±3%, ±4%, ±5%, ±10%, ±12%, or ±15%.
As used herein, the terms "first," "second," "third," and the like may be used as labels in their precedent noun, and these terms do not necessarily imply a particular order (e.g., a particular spatial, temporal, or logical order). As an example, the system may be described as determining a "first result" and a "second result," and the terms "first" and "second" may not necessarily imply that the first result is determined before the second result.
As used herein, the terms "based on" and "based at least in part on" may be used to describe or present one or more factors that affect a determination, and these terms may not exclude additional factors that may affect a determination. The determination may be based only on those factors presented, or may be based at least in part on those factors. The phrase "determining a based on B" means that B is a factor that affects the determination of a. In some cases, other factors may also contribute to the determination of a. In other instances, a may be determined based on B alone.

Claims (36)

1. A lidar system comprising:
a light source configured to emit pulses of light;
a scanner configured to scan at least a portion of the emitted pulses of light along a scan pattern contained within a field of view of the lidar system, wherein the field of view includes a ground portion that overlaps an area of ground in front of the lidar system;
a receiver configured to detect at least a portion of the scanned pulses of light scattered by one or more remote targets; and
a processor configured to:
identify the ground portion of the field of view, and
when the emitted pulses scan the ground portion of the field of view during a subsequent scan of the field of view, adjust a scan parameter such that at least one of a resolution or a pulse energy for the ground portion of the field of view is modified relative to another portion of the field of view.
2. The lidar system of claim 1, wherein, to identify the ground portion of the field of view, the processor is configured to determine one or more locations on the ground based on data from a camera or based on data from a previous scan of the field of view by the lidar system.
3. The lidar system of claim 1, wherein:
the scan pattern includes scan lines having a particular density, and
to adjust the scan parameter, the processor is configured to increase the density of the scan lines when scanning the ground portion of the field of view, so as to perform a high-resolution scan of the ground portion of the field of view.
4. The lidar system of claim 3, wherein the scanner comprises:
a first mirror configured to pivot or rotate about a first axis to scan the emitted pulses of light along a horizontal dimension, and
a second mirror configured to pivot about a second axis orthogonal to the first axis to scan the emitted pulses of light along a vertical dimension,
wherein, to increase the density of the scan lines when scanning the ground portion of the field of view, the processor is configured to modify a speed at which the second mirror pivots about the second axis.
5. The lidar system of claim 1, wherein:
the scan pattern has a particular horizontal resolution, and
to adjust the scan parameter, the processor is configured to increase a pixel density in a horizontal direction for the ground portion of the field of view.
6. The lidar system of claim 5, wherein:
the scan pattern includes a particular scan rate, and
to increase the horizontal resolution for the ground portion of the field of view, the processor is configured to decrease the scan rate when scanning the ground portion of the field of view.
7. The lidar system of claim 5, wherein:
the light source emits pulses of light at a particular pulse repetition frequency; and
to increase the horizontal resolution for the ground portion of the field of view, the processor is configured to increase the pulse repetition frequency when scanning the ground portion of the field of view.
8. The lidar system of claim 5, wherein, to increase the horizontal resolution of the ground portion of the field of regard, the processor is configured to:
when scanning a portion of the field of regard outside the ground portion, emit each new pulse of light only after a fixed-duration interval T has elapsed since the emission of the previous pulse of light, and
when scanning the ground portion of the field of regard, emit a new pulse of light in response to detecting scattered light from the previous pulse of light, or when scattered light from the previous pulse of light is not detected within the fixed-duration interval T.
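A sketch of the claim-8 trigger logic: outside the ground portion, pulses are paced by the fixed interval T; over the nearby ground, where returns arrive quickly, the next pulse may fire as soon as the previous return is detected, with T as a fallback timeout. The detect_return stand-in and the value of T are hypothetical.

```python
import random
from typing import Optional

T_S = 2e-6  # assumed fixed inter-pulse interval (2 microseconds)

def detect_return(timeout_s: float) -> Optional[float]:
    """Stand-in for the receiver: time until a return is detected, or None."""
    t = random.uniform(0.0, 2.0 * timeout_s)
    return t if t < timeout_s else None

def next_pulse_delay_s(in_ground_portion: bool) -> float:
    if not in_ground_portion:
        return T_S  # fixed pacing outside the ground portion
    t_detect = detect_return(T_S)
    # Fire early on detection; otherwise fall back to the fixed interval T.
    return t_detect if t_detect is not None else T_S

print(next_pulse_delay_s(False), next_pulse_delay_s(True))
```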
9. The lidar system of claim 5, wherein, to increase the horizontal resolution of the ground portion of the field of regard, the processor is configured to:
determine an expected distance from the lidar system to a point on the ground along an instantaneous field of view (IFOV) path of the light source, and
select the horizontal resolution based on the expected distance.
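One plausible reading of claim 9, assuming flat ground and a sensor mounted at height h: the expected range along a beam with depression angle phi is R = h / sin(phi), and a shorter expected range tolerates a coarser angular pitch for the same spacing of points on the ground. All numbers below are illustrative.

```python
import math

def expected_ground_range_m(sensor_height_m: float,
                            depression_angle_deg: float) -> float:
    # Flat-ground assumption: R = h / sin(phi).
    return sensor_height_m / math.sin(math.radians(depression_angle_deg))

def horizontal_pitch_deg(range_m: float, ground_spacing_m: float = 0.05) -> float:
    # Angular pitch that keeps ~5 cm spacing between points on the ground.
    return math.degrees(ground_spacing_m / range_m)

for phi in (2.0, 10.0, 30.0):
    r = expected_ground_range_m(1.5, phi)  # assumed 1.5 m mount height
    print(f"phi={phi:5.1f} deg  R={r:6.1f} m  pitch={horizontal_pitch_deg(r):.4f} deg")
```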
10. The lidar system of claim 1, wherein:
to adjust the scan parameter, the processor is configured to reduce a pulse energy of the emitted pulses of light as the emitted pulses of light scan the ground portion of the field of regard.
11. The lidar system of claim 10, wherein the processor is further configured to increase the pulse energy as the emitted pulses of light scan other portions of the field of regard, such that an average power of the emitted pulses of light for a subsequent scan of the field of regard is less than or equal to a particular threshold average power.
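Claims 10 and 11 together describe an energy budget: pulse energy drops over the ground portion and rises elsewhere, with the scan-averaged power held at or below a threshold. A sketch of that bookkeeping, with assumed PRF, energies, and threshold:

```python
def average_power_w(prf_hz: float, ground_fraction: float,
                    e_ground_j: float, e_other_j: float) -> float:
    # Scan-averaged optical power for a two-level pulse-energy policy.
    return prf_hz * (ground_fraction * e_ground_j +
                     (1.0 - ground_fraction) * e_other_j)

PRF_HZ = 600_000.0     # assumed pulse repetition frequency
P_MAX_W = 0.5          # assumed threshold average power
GROUND_FRACTION = 0.3  # assumed share of pulses aimed at the ground portion

p_avg = average_power_w(PRF_HZ, GROUND_FRACTION, 0.4e-6, 1.0e-6)
print(f"average power {p_avg:.3f} W, within budget: {p_avg <= P_MAX_W}")
```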
12. The lidar system of claim 1, wherein the processor is further configured to increase the pulse energy when a pulse of the emitted light is incident on a point on the ground in front of the lidar system at a grazing angle below a certain threshold.
13. The lidar system of claim 1, wherein:
the processor is further configured to determine, based on a previous scan of the field of regard, a low-scattering portion of the field of regard that produces a relatively low amount of scattered light and a high-scattering portion of the field of regard that produces a relatively high amount of scattered light, and
to adjust the scan parameter, the processor is configured to perform at least one of:
increasing the pulse energy when the emitted pulses of light scan the low-scattering portion of the field of regard, and
decreasing the pulse energy when the emitted pulses of light scan the high-scattering portion of the field of regard.
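A sketch of claim 13's feedback: a per-direction scatter map from the previous scan of the field of regard drives the pulse energy up where returns were weak and down where they were strong. The thresholds, scaling factors, and map below are all assumptions.

```python
def adjusted_energy_uj(base_uj: float, prior_return: float,
                       low: float = 0.2, high: float = 0.8) -> float:
    if prior_return < low:   # low-scattering direction: boost pulse energy
        return 1.5 * base_uj
    if prior_return > high:  # high-scattering direction: reduce pulse energy
        return 0.5 * base_uj
    return base_uj

# (elevation_deg, azimuth_deg) -> normalized return signal from the prior scan
scatter_map = {(-10.0, 0.0): 0.9, (0.0, 0.0): 0.5, (10.0, 0.0): 0.1}
for direction, signal in scatter_map.items():
    print(direction, adjusted_energy_uj(1.0, signal))
```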
14. The lidar system of claim 1, wherein the processor is further configured to:
obtain an estimate of an absorption of a ground surface located in front of the lidar system, and
further adjust the scan parameter according to the obtained estimate of the absorption.
15. The lidar system of claim 1, wherein the light source comprises:
a pulsed laser diode configured to generate optical seed pulses; and
one or more optical amplifiers configured to amplify the optical seed pulses to produce the emitted pulses of light.
16. The lidar system of claim 15, wherein:
each of the one or more optical amplifiers includes an optical gain fiber and one or more pump laser diodes that provide an amount of optical pump power to the gain fiber; and
to adjust the scan parameter, the processor is configured to increase the pulse energy, including increasing the amount of optical pump power provided to the gain fiber, as the emitted pulses of light scan the ground portion of the field of regard.
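A rough model of the claim-16 mechanism: in a pulsed fiber amplifier, the energy extracted per pulse scales with the pump power delivered to the gain fiber between pulses. The linear model and every number below are assumptions for illustration, not device characteristics from the specification.

```python
def pulse_energy_uj(pump_power_w: float, prf_hz: float,
                    conversion_efficiency: float = 0.3) -> float:
    # Energy stored between pulses ~ efficiency * pump power / PRF (assumed).
    return 1e6 * conversion_efficiency * pump_power_w / prf_hz

PRF_HZ = 600_000.0
print(pulse_energy_uj(2.0, PRF_HZ))  # ~1.0 uJ at 2 W of pump power
print(pulse_energy_uj(4.0, PRF_HZ))  # ~2.0 uJ at 4 W (ground portion, per claim 16)
```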
17. The lidar system of claim 1, wherein:
the light source comprises a direct-emitter laser diode configured to generate the emitted pulses of light; and
the processor is configured to vary a pulse energy of the direct-emitter laser diode independently of a pulse repetition frequency of the direct-emitter laser diode.
18. A method in a lidar system for scanning a field of regard of the lidar system, the method comprising:
identifying, within the field of regard, a ground portion that overlaps an area of the ground in front of the lidar system;
causing a light source to emit pulses of light;
scanning at least a portion of the emitted pulses of light along a scan pattern contained within the field of regard, including adjusting a scan parameter such that at least one of a resolution or a pulse energy for the ground portion of the field of regard is modified relative to another portion of the field of regard; and
detecting at least a portion of the scanned pulses of light scattered by one or more remote targets.
19. The method of claim 18, wherein identifying the ground portion of the field of regard comprises determining one or more locations on the ground based on data from a camera or based on data from a previous scan of the field of regard of the lidar system.
20. The method of claim 18, wherein:
the scan pattern includes scan lines having a particular density, and
adjusting the scan parameter comprises increasing the density of the scan lines when scanning the ground portion of the field of regard, so as to perform a high-resolution scan of the ground portion.
21. The method of claim 18, wherein:
the scan pattern has a particular horizontal resolution, and
adjusting the scan parameter comprises increasing the horizontal resolution for the ground portion of the field of regard.
22. The method of claim 21, wherein:
the scan pattern includes a particular scan rate, and
increasing the horizontal resolution of the ground portion of the field of regard comprises decreasing the scan rate when scanning the ground portion.
23. The method of claim 21, wherein:
the light source emits pulses of light at a particular pulse repetition frequency; and
increasing the horizontal resolution of the ground portion of the field of regard comprises increasing the pulse repetition frequency while scanning the ground portion.
24. The method of claim 21, wherein increasing the horizontal resolution of the ground portion of the field of regard comprises:
when scanning a portion of the field of regard outside the ground portion, emitting each new pulse of light only after a fixed-duration interval T1 has elapsed since the emission of the previous pulse of light, and
when scanning the ground portion of the field of regard, emitting a new pulse of light in response to detecting scattered light from the previous pulse of light, or when scattered light from the previous pulse of light is not detected within the fixed-duration interval T1.
25. The method of claim 21, wherein increasing the horizontal resolution of the ground portion of the field of regard comprises:
determining an expected distance from the lidar system to a point on the ground along an instantaneous field of view (IFOV) path of the light source, and
selecting the horizontal resolution based on the expected distance.
26. The method of claim 18, wherein adjusting the scan parameter comprises reducing a pulse energy of the emitted pulses of light as the emitted pulses of light scan the ground portion of the field of regard.
27. The method of claim 26, further comprising:
increasing the pulse energy as the emitted pulses of light scan other portions of the field of regard, such that an average power of the emitted pulses of light for a subsequent scan of the field of regard is less than or equal to a particular threshold average power.
28. The method of claim 18, further comprising increasing the pulse energy when a pulse of the emitted light is incident on a point on the ground in front of the lidar system at a grazing angle below a certain threshold.
29. The method of claim 18, further comprising:
determining, based on a previous scan of the field of regard, a low-scattering portion of the field of regard that produces a relatively low amount of scattered light and a high-scattering portion of the field of regard that produces a relatively high amount of scattered light,
wherein adjusting the scan parameter comprises:
increasing the pulse energy when the emitted pulses of light scan the low-scattering portion of the field of regard, and
decreasing the pulse energy when the emitted pulses of light scan the high-scattering portion of the field of regard.
30. An autonomous vehicle comprising:
a vehicle maneuvering assembly configured to effect at least steering, acceleration, and braking of the autonomous vehicle; and
a lidar system comprising:
a light source configured to emit pulses of light,
a scanner configured to scan at least a portion of the emitted pulses of light along a scan pattern contained within a field of regard of the lidar system, wherein the field of regard includes a ground portion that overlaps an area of the ground in front of the lidar system, and
a receiver configured to detect at least a portion of the scanned pulses of light scattered by one or more remote targets; and
a vehicle controller communicatively coupled to the vehicle maneuvering assembly and the lidar system, the vehicle controller configured to control the vehicle maneuvering assembly using signals generated by the lidar system;
wherein the lidar system is configured to:
when the transmitted pulse scans the ground portion of the field of view during a subsequent scan of the field of view, adjusting scan parameters such that at least one of a resolution or a pulse energy of the ground portion of the field of view is modified relative to another portion of the field of view.
31. The autonomous vehicle of claim 30, wherein the one or more remote targets comprise a ground surface or one or more vehicles.
32. The autonomous vehicle of claim 30, wherein:
the lidar system is included in a vehicle; and
the area of the ground includes at least a portion of a road on which the vehicle is operating.
33. The autonomous vehicle of claim 30, wherein the lidar system is configured to adjust the scan parameter in accordance with a command received from the vehicle controller.
34. The autonomous vehicle of claim 30, wherein, to identify the ground portion of the field of regard, the lidar system is configured to determine one or more locations on the ground based on data from a camera or based on data from a previous scan of the field of regard of the lidar system.
35. The autonomous vehicle of claim 30, wherein:
the scan pattern includes scan lines having a particular density, and
the scan pattern has a particular horizontal resolution; and
to adjust the scan parameter, the lidar system is configured to increase at least one of (i) the density of the scan lines or (ii) the horizontal resolution while scanning the ground portion of the field of regard, so as to perform a high-resolution scan of the ground portion.
36. The autonomous vehicle of claim 30, wherein:
to adjust the scan parameter, the lidar system is configured to reduce a pulse energy of the emitted pulses of light as the emitted pulses of light scan the ground portion of the field of regard.
CN201980057112.1A 2018-07-19 2019-07-19 Adjustable pulse characteristics for ground detection in a lidar system Pending CN112955776A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US16/040263 2018-07-19
US16/040,263 US10627516B2 (en) 2018-07-19 2018-07-19 Adjustable pulse characteristics for ground detection in lidar systems
PCT/US2019/042612 WO2020018908A1 (en) 2018-07-19 2019-07-19 Adjustable pulse characteristics for ground detection in lidar systems

Publications (1)

Publication Number Publication Date
CN112955776A 2021-06-11

Family

ID=69162929

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980057112.1A Pending CN112955776A (en) 2018-07-19 2019-07-19 Adjustable pulse characteristics for ground detection in a lidar system

Country Status (5)

Country Link
US (1) US10627516B2 (en)
EP (1) EP3824312A4 (en)
JP (1) JP7108789B2 (en)
CN (1) CN112955776A (en)
WO (1) WO2020018908A1 (en)


Families Citing this family (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016208883A1 (en) * 2016-05-23 2017-11-23 Robert Bosch Gmbh A method for providing vehicle trajectory information and method for locating a pothole
DE102017205619A1 (en) * 2017-04-03 2018-10-04 Robert Bosch Gmbh LiDAR system and method for operating a LiDAR system
CA3069305A1 (en) 2017-07-10 2019-01-17 3D at Depth, Inc. Underwater optical positioning systems and methods
US11507097B2 (en) * 2018-02-05 2022-11-22 Pixart Imaging Inc. Control apparatus for auto clean machine and auto clean machine control method
US10627516B2 (en) 2018-07-19 2020-04-21 Luminar Technologies, Inc. Adjustable pulse characteristics for ground detection in lidar systems
US11808887B2 (en) * 2018-11-02 2023-11-07 Waymo Llc Methods and systems for mapping retroreflectors
US11460578B2 (en) * 2018-12-28 2022-10-04 Robert Bosch Gmbh 3D lidar sensing unit with oscillating axis of rotation for autonomous driving
KR102481212B1 (en) 2019-01-04 2022-12-26 블랙모어 센서스 앤드 애널리틱스, 엘엘씨 Method and system for refractive beam-steering
DE102019202040B4 (en) * 2019-02-15 2022-01-27 Zf Friedrichshafen Ag Safe autonomous agricultural machine
US11698441B2 (en) * 2019-03-22 2023-07-11 Viavi Solutions Inc. Time of flight-based three-dimensional sensing system
CN110146892B (en) * 2019-05-05 2023-08-01 浙江宜通华盛科技有限公司 Dual-polarization radar
US11754682B2 (en) 2019-05-30 2023-09-12 Microvision, Inc. LIDAR system with spatial beam combining
US11796643B2 (en) * 2019-05-30 2023-10-24 Microvision, Inc. Adaptive LIDAR scanning methods
US11828881B2 (en) 2019-05-30 2023-11-28 Microvision, Inc. Steered LIDAR system with arrayed receiver
US11556000B1 (en) 2019-08-22 2023-01-17 Red Creamery Llc Distally-actuated scanning mirror
US11754710B2 (en) * 2019-10-18 2023-09-12 Ai4 International Oy Measurement device and method of operating therefor
JP2021089405A (en) * 2019-12-06 2021-06-10 セイコーエプソン株式会社 Optical scanner, three-dimensional measuring device, and robot system
US11619582B2 (en) * 2020-07-07 2023-04-04 Gamma Scientific Inc. Retroreflectometer for non-contact measurements of optical characteristics
US11320124B1 (en) 2020-10-29 2022-05-03 Waymo Llc Infrared light module uniformity rotational test module
FR3116906B1 (en) * 2020-12-02 2023-06-30 Airbus Helicopters Method and system for detecting obstacles with an obstacle sensor for an aircraft
US20220187463A1 (en) * 2020-12-14 2022-06-16 Luminar, Llc Generating Scan Patterns Using Cognitive Lidar
US11802945B2 (en) 2021-03-10 2023-10-31 Allegro Microsystems, Llc Photonic ROIC having safety features
US11581697B2 (en) 2021-03-10 2023-02-14 Allegro Microsystems, Llc Detector system comparing pixel response with photonic energy decay
US11635495B1 (en) 2021-03-26 2023-04-25 Aeye, Inc. Hyper temporal lidar with controllable tilt amplitude for a variable amplitude scan mirror
US11686846B2 (en) 2021-03-26 2023-06-27 Aeye, Inc. Bistatic lidar architecture for vehicle deployments
US11822016B2 (en) 2021-03-26 2023-11-21 Aeye, Inc. Hyper temporal lidar using multiple matched filters to orient a lidar system to a frame of reference
US20230044929A1 (en) 2021-03-26 2023-02-09 Aeye, Inc. Multi-Lens Lidar Receiver with Multiple Readout Channels
US11630188B1 (en) 2021-03-26 2023-04-18 Aeye, Inc. Hyper temporal lidar with dynamic laser control using safety models
US11460556B1 (en) 2021-03-26 2022-10-04 Aeye, Inc. Hyper temporal lidar with shot scheduling for variable amplitude scan mirror
US20220317249A1 (en) * 2021-03-26 2022-10-06 Aeye, Inc. Hyper Temporal Lidar with Switching Between a Baseline Scan Mode and a Pulse Burst Mode
JP2022152236A (en) * 2021-03-29 2022-10-12 株式会社小糸製作所 Measurement device
CN113189605B (en) * 2021-04-08 2022-06-17 中电海康集团有限公司 Method and system for improving laser ranging precision based on uncertainty
KR102312012B1 (en) * 2021-04-13 2021-10-12 세종대학교산학협력단 Aerial analysis of ground surface using distance sensor of unmanned aerial vehicle
US11601733B2 (en) 2021-04-14 2023-03-07 Allegro Microsystems, Llc Temperature sensing of a photodetector array
US11815406B2 (en) 2021-04-14 2023-11-14 Allegro Microsystems, Llc Temperature sensing of an array from temperature dependent properties of a PN junction
US11770632B2 (en) 2021-04-14 2023-09-26 Allegro Microsystems, Llc Determining a temperature of a pixel array by measuring voltage of a pixel
EP4099052A1 (en) 2021-06-03 2022-12-07 Allegro MicroSystems, LLC Arrayed time to digital converter
US11252359B1 (en) 2021-06-21 2022-02-15 Allegro Microsystems, Llc Image compensation for sensor array having bad pixels
US11600654B2 (en) 2021-07-15 2023-03-07 Allegro Microsystems, Llc Detector array yield recovery
US11885646B2 (en) 2021-08-12 2024-01-30 Allegro Microsystems, Llc Programmable active pixel test injection
US11585910B1 (en) 2021-08-13 2023-02-21 Allegro Microsystems, Llc Non-uniformity correction of photodetector arrays
EP4215936A1 (en) * 2022-01-25 2023-07-26 Robert Bosch GmbH Lidar system
US11933669B2 (en) 2022-02-16 2024-03-19 Allegro Microsystems, Llc Optical system for improved reliability and performance
CN114415192B (en) * 2022-03-28 2022-07-05 北京一径科技有限公司 Laser radar system and calibration method thereof
US11722141B1 (en) 2022-04-22 2023-08-08 Allegro Microsystems, Llc Delay-locked-loop timing error mitigation
CN115792930B (en) * 2023-02-06 2023-04-18 长沙思木锐信息技术有限公司 Laser radar capable of orthogonal receiving and transmitting and scanning method and system thereof
CN115841766B (en) * 2023-02-22 2023-07-04 青岛慧拓智能机器有限公司 Parking spot recommendation method for mining area operation area

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002040139A (en) * 2000-07-28 2002-02-06 Denso Corp Method and device for body recognition, and recording medium
CN103076614A (en) * 2013-01-18 2013-05-01 山东理工大学 Laser scanning method and device for helicopter collision avoidance
CN103197321A (en) * 2013-03-22 2013-07-10 北京航空航天大学 Full-waveform laser radar system
US20130317649A1 (en) * 2012-05-25 2013-11-28 United States Of America As Represented By The Secretary Of The Navy Nodding Mechanism For A Single-Scan Sensor
CN105093925A (en) * 2015-07-15 2015-11-25 山东理工大学 Measured-landform-feature-based real-time adaptive adjusting method and apparatus for airborne laser radar parameters
US20170168146A1 (en) * 2015-12-15 2017-06-15 Uber Technologies, Inc. Dynamic lidar sensor controller
CN107430195A (en) * 2015-03-25 2017-12-01 伟摩有限责任公司 Vehicle with the detection of multiple light and range unit (LIDAR)
CN107687818A (en) * 2016-08-04 2018-02-13 纬创资通股份有限公司 Three-dimensional measurement method and three-dimensional measurement device
US20180074203A1 (en) * 2016-09-12 2018-03-15 Delphi Technologies, Inc. Lidar Object Detection System for Automated Vehicles
CN108010121A (en) * 2016-10-27 2018-05-08 莱卡地球系统公开股份有限公司 Method for processing scan data

Family Cites Families (137)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5006721A (en) 1990-03-23 1991-04-09 Perceptron, Inc. Lidar scanning system
JPH07134178A (en) 1993-11-12 1995-05-23 Omron Corp On-vehicle distance measuring device using laser beam
US5689519A (en) * 1993-12-20 1997-11-18 Imra America, Inc. Environmentally stable passively modelocked fiber laser pulse source
US7209221B2 (en) * 1994-05-23 2007-04-24 Automotive Technologies International, Inc. Method for obtaining and displaying information about objects in a vehicular blind spot
US8041483B2 (en) * 1994-05-23 2011-10-18 Automotive Technologies International, Inc. Exterior airbag deployment techniques
US6318634B1 (en) 1998-08-13 2001-11-20 Psc Scanning, Inc. Speed variable angle facet wheel for scanner
US6266442B1 (en) * 1998-10-23 2001-07-24 Facet Technology Corp. Method and apparatus for identifying objects depicted in a videostream
US6891960B2 (en) * 2000-08-12 2005-05-10 Facet Technology System for road sign sheeting classification
JP4595197B2 (en) 2000-12-12 2010-12-08 株式会社デンソー Distance measuring device
US6723975B2 (en) * 2001-02-07 2004-04-20 Honeywell International Inc. Scanner for airborne laser system
DE10110420A1 (en) * 2001-03-05 2002-09-12 Sick Ag Device for determining a distance profile
US6542226B1 (en) 2001-06-04 2003-04-01 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Planar particle imaging and doppler velocimetry system and method
EP1267177A1 (en) * 2001-06-15 2002-12-18 IBEO Automobile Sensor GmbH Method and device for objects location finding in space
DE10143060A1 (en) * 2001-09-03 2003-03-20 Sick Ag Vehicle laser scanner transmits wide beam front towards moving deflector, causing reflective front to adopt various orientations in scanned space
US6665063B2 (en) * 2001-09-04 2003-12-16 Rosemount Aerospace Inc. Distributed laser obstacle awareness system
DE10153270A1 (en) * 2001-10-29 2003-05-08 Sick Ag Optoelectronic distance measuring device
US8457230B2 (en) 2002-08-21 2013-06-04 Broadcom Corporation Reconfigurable orthogonal frequency division multiplexing (OFDM) chip supporting single weight diversity
DE10244641A1 (en) * 2002-09-25 2004-04-08 Ibeo Automobile Sensor Gmbh Optoelectronic position monitoring system for road vehicle has two pulsed lasers, sensor and mechanical scanner with mirror at 45 degrees on shaft with calibration disk driven by electric motor
US7095488B2 (en) * 2003-01-21 2006-08-22 Rosemount Aerospace Inc. System for profiling objects on terrain forward and below an aircraft utilizing a cross-track laser altimeter
EP1706920A4 (en) 2003-12-04 2008-01-23 Optical Air Data Systems Lp Very high power pulsed fiber laser
US7583364B1 (en) 2004-03-19 2009-09-01 University Corporation For Atmospheric Research High pulse-energy, eye-safe lidar system
DE102004033114A1 (en) 2004-07-08 2006-01-26 Ibeo Automobile Sensor Gmbh Method for calibrating a distance image sensor
IL165212A (en) * 2004-11-15 2012-05-31 Elbit Systems Electro Optics Elop Ltd Device for scanning light
US7532311B2 (en) * 2005-04-06 2009-05-12 Lockheed Martin Coherent Technologies, Inc. Efficient lidar with flexible target interrogation pattern
US7652752B2 (en) * 2005-07-14 2010-01-26 Arete' Associates Ultraviolet, infrared, and near-infrared lidar system and method
US7711014B2 (en) 2005-10-11 2010-05-04 Clear Align Llc Apparatus and method for generating short optical pulses
CA2753398A1 (en) 2005-11-10 2007-07-26 Optical Air Data Systems, Llc Single aperture multiple optical waveguide transceiver
US8050863B2 (en) * 2006-03-16 2011-11-01 Gray & Company, Inc. Navigation and control system for autonomous vehicles
US7626193B2 (en) * 2006-03-27 2009-12-01 Princeton Lightwave, Inc. Apparatus comprising a single photon photodetector having reduced afterpulsing and method therefor
US7443903B2 (en) * 2006-04-19 2008-10-28 Mobius Photonics, Inc. Laser apparatus having multiple synchronous amplifiers tied to one master oscillator
USRE46672E1 (en) 2006-07-13 2018-01-16 Velodyne Lidar, Inc. High definition LiDAR system
US8767190B2 (en) * 2006-07-13 2014-07-01 Velodyne Acoustics, Inc. High definition LiDAR system
CN101688774A (en) * 2006-07-13 2010-03-31 威力登音响公司 High definition lidar system
US7701558B2 (en) * 2006-09-22 2010-04-20 Leica Geosystems Ag LIDAR system
WO2008052365A1 (en) * 2006-10-30 2008-05-08 Autonosys Inc. Scanning system for lidar
US7872794B1 (en) 2007-01-21 2011-01-18 Lockheed Martin Corporation High-energy eye-safe pulsed fiber amplifiers and sources operating in erbium's L-band
US7577178B2 (en) 2007-01-23 2009-08-18 Peter Dragic Narrow linewidth injection seeded Q-switched fiber ring laser based on a low-SBS fiber
US7649920B2 (en) * 2007-04-03 2010-01-19 Topcon Corporation Q-switched microlaser apparatus and method for use
US20080309913A1 (en) 2007-06-14 2008-12-18 James John Fallon Systems and methods for laser radar imaging for the blind and visually impaired
SE534852C2 (en) 2007-07-05 2012-01-24 Atlas Copco Tools Ab Torque sensing unit for a power tool
US7945408B2 (en) * 2007-09-20 2011-05-17 Voxis, Inc. Time delay estimation
US8138849B2 (en) * 2007-09-20 2012-03-20 Voxis, Inc. Transmission lines applied to contact free slip rings
US8027029B2 (en) * 2007-11-07 2011-09-27 Magna Electronics Inc. Object detection and tracking system
US20090140887A1 (en) 2007-11-29 2009-06-04 Breed David S Mapping Techniques Using Probe Vehicles
US20090273770A1 (en) 2008-04-30 2009-11-05 Honeywell International Inc. Systems and methods for safe laser imaging, detection and ranging (lidar) operation
DE102008032216A1 (en) * 2008-07-09 2010-01-14 Sick Ag Device for detecting the presence of an object in space
US8126642B2 (en) * 2008-10-24 2012-02-28 Gray & Company, Inc. Control and systems for autonomously driven vehicles
US8364334B2 (en) * 2008-10-30 2013-01-29 Honeywell International Inc. System and method for navigating an autonomous vehicle using laser detection and ranging
DE502008001000D1 (en) * 2008-11-21 2010-09-02 Sick Ag Optoelectronic sensor and method for measuring distances according to the time of flight principle
US7982662B2 (en) 2008-12-08 2011-07-19 Intellex, Llc Scanning array for obstacle detection and collision avoidance
EP2202533A1 (en) * 2008-12-23 2010-06-30 IBEO Automobile Sensor GmbH Logging device
US8447563B2 (en) 2009-03-31 2013-05-21 The United States Of America As Represented By The Secretary Of The Navy Method and system for determination of detection probability or a target object based on a range
US8675181B2 (en) * 2009-06-02 2014-03-18 Velodyne Acoustics, Inc. Color LiDAR scanner
EP2449637B1 (en) * 2009-06-30 2013-04-10 Trimble AB Optical pulse transmitter
JP5710108B2 (en) 2009-07-03 2015-04-30 日本信号株式会社 Optical distance measuring device
US8279420B2 (en) * 2009-08-06 2012-10-02 ISC8 Inc. Phase sensing and scanning time of flight LADAR using atmospheric absorption bands
US9091754B2 (en) * 2009-09-02 2015-07-28 Trimble A.B. Distance measurement methods and apparatus
US8081301B2 (en) * 2009-10-08 2011-12-20 The United States Of America As Represented By The Secretary Of The Army LADAR transmitting and receiving system and method
US8934509B2 (en) 2009-11-23 2015-01-13 Lockheed Martin Corporation Q-switched oscillator seed-source for MOPA laser illuminator method and apparatus
WO2011127375A1 (en) * 2010-04-09 2011-10-13 Pochiraju Kishore V Adaptive mechanism control and scanner positioning for improved three-dimensional laser scanning
US8260539B2 (en) * 2010-05-12 2012-09-04 GM Global Technology Operations LLC Object and vehicle detection and tracking using 3-D laser rangefinder
KR101746499B1 (en) * 2010-12-23 2017-06-14 한국전자통신연구원 System of Dynamic Range Three-dimensional Image
US20140293266A1 (en) 2011-08-04 2014-10-02 Ying Hsu Local Alignment and Positioning Device and Method
US9002151B2 (en) * 2011-10-05 2015-04-07 Telcordia Technologies, Inc. System and method for nonlinear optical devices
US9041136B2 (en) * 2011-10-20 2015-05-26 Agency For Science, Technology And Research Avalanche photodiode
DE102011119707A1 (en) * 2011-11-29 2013-05-29 Valeo Schalter Und Sensoren Gmbh Optical measuring device
DE102011122345A1 (en) * 2011-12-23 2013-06-27 Valeo Schalter Und Sensoren Gmbh Optical measuring device and method for producing a cover for a housing of an optical measuring device
JP2013156138A (en) 2012-01-30 2013-08-15 Ihi Corp Moving object detecting apparatus
DE102012002922A1 (en) 2012-02-14 2013-08-14 Audi Ag Time-of-flight camera for a motor vehicle, motor vehicle and method for operating a time-of-flight camera
US9213085B2 (en) * 2012-02-16 2015-12-15 Nucript LLC System and method for measuring the phase of a modulated optical signal
USRE48914E1 (en) 2012-03-02 2022-02-01 Leddartech Inc. System and method for multipurpose traffic detection and characterization
EP2637038B1 (en) * 2012-03-07 2017-01-18 Vectronix AG Distance sensor
US20160146939A1 (en) 2014-11-24 2016-05-26 Apple Inc. Multi-mirror scanning depth engine
DE102012006869A1 (en) 2012-04-04 2013-10-10 Valeo Schalter Und Sensoren Gmbh Optoelectronic sensor device, in particular laser scanner, with an adapted receiving unit for optimized reception level reduction
WO2013177650A1 (en) 2012-04-26 2013-12-05 Neptec Design Group Ltd. High speed 360 degree scanning lidar head
US9246041B1 (en) 2012-04-26 2016-01-26 Id Quantique Sa Apparatus and method for allowing avalanche photodiode based single-photon detectors to be driven by the same electrical circuit in gated and in free-running modes
US8796605B2 (en) * 2012-05-04 2014-08-05 Princeton Lightwave, Inc. High-repetition-rate single-photon receiver and method therefor
US9157989B2 (en) 2012-08-29 2015-10-13 Trimble Ab Distance measurement methods and apparatus
US8996228B1 (en) * 2012-09-05 2015-03-31 Google Inc. Construction zone object detection using light detection and ranging
CN104603575A (en) * 2012-09-06 2015-05-06 法罗技术股份有限公司 Laser scanner with additional sensing device
GB2522142A (en) * 2012-09-14 2015-07-15 Faro Tech Inc Laser scanner with dynamical adjustment of angular scan velocity
US9383753B1 (en) 2012-09-26 2016-07-05 Google Inc. Wide-view LIDAR with areas of special attention
EP2722684B1 (en) 2012-10-19 2019-08-28 Sick Ag Laser scanner
DE102012110595A1 (en) * 2012-11-06 2014-05-08 Conti Temic Microelectronic Gmbh Method and device for detecting traffic signs for a vehicle
US9823351B2 (en) 2012-12-18 2017-11-21 Uber Technologies, Inc. Multi-clad fiber based optical apparatus and methods for light detection and ranging sensors
DE102012025281A1 (en) 2012-12-21 2014-06-26 Valeo Schalter Und Sensoren Gmbh Optical object detection device with a MEMS and motor vehicle with such a detection device
US20160047901A1 (en) 2012-12-25 2016-02-18 Quanergy Systems, Inc. Robust lidar sensor for broad weather, shock and vibration conditions
US9086481B1 (en) 2013-01-18 2015-07-21 Google Inc. Methods and systems for estimating vehicle speed
US9285477B1 (en) 2013-01-25 2016-03-15 Apple Inc. 3D depth point cloud from timing flight of 2D scanned light beam pulses
US20140211194A1 (en) 2013-01-27 2014-07-31 Quanergy Systems, Inc. Cost-effective lidar sensor for multi-signal detection, weak signal detection and signal disambiguation and method of using same
ES2512965B2 (en) 2013-02-13 2015-11-24 Universitat Politècnica De Catalunya System and method to scan a surface and computer program that implements the method
US9304154B1 (en) 2013-03-04 2016-04-05 Google Inc. Dynamic measurements of pulse peak value
US9063549B1 (en) 2013-03-06 2015-06-23 Google Inc. Light detection and ranging device with oscillating mirror driven by magnetically interactive coil
US9086273B1 (en) 2013-03-08 2015-07-21 Google Inc. Microrod compression of laser beam in combination with transmit lens
US9304203B1 (en) 2013-03-13 2016-04-05 Google Inc. Methods, devices, and systems for improving dynamic range of signal receiver
US9069060B1 (en) 2013-03-13 2015-06-30 Google Inc. Circuit architecture for optical receiver with increased dynamic range
US9048370B1 (en) 2013-03-14 2015-06-02 Google Inc. Dynamic control of diode bias voltage (photon-caused avalanche)
US20140293263A1 (en) 2013-03-28 2014-10-02 James Justice LIDAR Comprising Polyhedron Transmission and Receiving Scanning Element
US9121703B1 (en) 2013-06-13 2015-09-01 Google Inc. Methods and systems for controlling operation of a laser device
EP2824478B1 (en) * 2013-07-11 2015-05-06 Sick Ag Optoelectronic sensor and method for detecting and measuring the distance of objects in a monitored area
DE102013011853A1 (en) 2013-07-16 2015-01-22 Valeo Schalter Und Sensoren Gmbh Optoelectronic detection device and method for scanning the environment of a motor vehicle
US10126412B2 (en) 2013-08-19 2018-11-13 Quanergy Systems, Inc. Optical phased array lidar system and method of using same
US8836922B1 (en) * 2013-08-20 2014-09-16 Google Inc. Devices and methods for a rotating LIDAR platform with a shared transmit/receive path
US9299731B1 (en) 2013-09-30 2016-03-29 Google Inc. Systems and methods for selectable photodiode circuits
US10203399B2 (en) 2013-11-12 2019-02-12 Big Sky Financial Corporation Methods and apparatus for array based LiDAR systems with reduced interference
AU2014352833B2 (en) 2013-11-22 2019-12-05 Aurora Operations, Inc. LiDAR scanner calibration
JP6146295B2 (en) 2013-12-26 2017-06-14 株式会社豊田中央研究所 Radar apparatus and speed direction measuring method
US20150192677A1 (en) 2014-01-03 2015-07-09 Quanergy Systems, Inc. Distributed lidar sensing system for wide field of view three dimensional mapping and method of using same
US9625580B2 (en) 2014-01-03 2017-04-18 Princeton Lightwave, Inc. LiDAR system comprising a single-photon detector
DE102014100696B3 (en) 2014-01-22 2014-12-31 Sick Ag Distance measuring sensor and method for detection and distance determination of objects
US9360554B2 (en) * 2014-04-11 2016-06-07 Facet Technology Corp. Methods and apparatus for object detection and identification in a multiple detector lidar array
DE102014106465C5 (en) 2014-05-08 2018-06-28 Sick Ag Distance measuring sensor and method for detection and distance determination of objects
WO2015199735A1 (en) * 2014-06-27 2015-12-30 Hrl Laboratories, Llc Scanning lidar and method of producing the same
US9753351B2 (en) 2014-06-30 2017-09-05 Quanergy Systems, Inc. Planar beam forming and steering optical phased array chip and method of using same
US9476968B2 (en) 2014-07-24 2016-10-25 Rosemount Aerospace Inc. System and method for monitoring optical subsystem performance in cloud LIDAR systems
US9529087B2 (en) 2014-07-24 2016-12-27 GM Global Technology Operations LLC Curb detection using lidar with sparse measurements
US10386464B2 (en) 2014-08-15 2019-08-20 Aeye, Inc. Ladar point cloud compression
US9869753B2 (en) 2014-08-15 2018-01-16 Quanergy Systems, Inc. Three-dimensional-mapping two-dimensional-scanning lidar based on one-dimensional-steering optical phased arrays and method of using same
US9453941B2 (en) 2014-12-22 2016-09-27 GM Global Technology Operations LLC Road surface reflectivity detection by lidar sensor
US10107914B2 (en) 2015-02-20 2018-10-23 Apple Inc. Actuated optical element for light beam scanning device
US9368933B1 (en) 2015-04-06 2016-06-14 Voxtel, Inc. Er,Yb:YAB laser system
DE102015217912A1 (en) 2015-09-18 2017-03-23 Robert Bosch Gmbh Method for calibrating the runtime of a lidar sensor
EP3206045B1 (en) 2016-02-15 2021-03-31 Airborne Hydrography AB Single-photon lidar scanner
CN108885263B (en) 2016-03-21 2024-02-23 威力登激光雷达有限公司 LIDAR-based 3D imaging with variable pulse repetition
EP3433634B8 (en) 2016-03-21 2021-07-21 Velodyne Lidar USA, Inc. Lidar based 3-d imaging with varying illumination field density
JP6728837B2 (en) 2016-03-23 2020-07-22 株式会社デンソーウェーブ Laser radar device
US10451740B2 (en) 2016-04-26 2019-10-22 Cepton Technologies, Inc. Scanning lidar systems for three-dimensional sensing
KR102598711B1 (en) 2016-07-07 2023-11-06 삼성전자주식회사 lidar apparatus emitting non-uniform light and autonomous robot comprising thereof
US20180032042A1 (en) 2016-08-01 2018-02-01 Qualcomm Incorporated System And Method Of Dynamically Controlling Parameters For Processing Sensor Output Data
US10557928B2 (en) 2016-08-31 2020-02-11 Qualcomm Incorporated Methods, systems, and apparatus for dynamically adjusting radiated signals
US20180113216A1 (en) 2016-10-25 2018-04-26 Innoviz Technologies Ltd. Methods Circuits Devices Assemblies Systems and Functionally Associated Machine Executable Code for Active Optical Scanning of a Scene
US10942272B2 (en) 2016-12-13 2021-03-09 Waymo Llc Power modulation for a rotary light detection and ranging (LIDAR) device
US10114111B2 (en) 2017-03-28 2018-10-30 Luminar Technologies, Inc. Method for dynamically controlling laser power
US10254388B2 (en) 2017-03-28 2019-04-09 Luminar Technologies, Inc. Dynamically varying laser output in a vehicle in view of weather conditions
US20180284234A1 (en) 2017-03-29 2018-10-04 Luminar Technologies, Inc. Foveated Imaging in a Lidar System
US10211593B1 (en) * 2017-10-18 2019-02-19 Luminar Technologies, Inc. Optical amplifier with multi-wavelength pumping
US10627516B2 (en) 2018-07-19 2020-04-21 Luminar Technologies, Inc. Adjustable pulse characteristics for ground detection in lidar systems

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002040139A (en) * 2000-07-28 2002-02-06 Denso Corp Method and device for body recognition, and recording medium
US20130317649A1 (en) * 2012-05-25 2013-11-28 United States Of America As Represented By The Secretary Of The Navy Nodding Mechanism For A Single-Scan Sensor
CN103076614A (en) * 2013-01-18 2013-05-01 山东理工大学 Laser scanning method and device for helicopter collision avoidance
CN103197321A (en) * 2013-03-22 2013-07-10 北京航空航天大学 Full-waveform laser radar system
CN107430195A (en) * 2015-03-25 2017-12-01 伟摩有限责任公司 Vehicle with the detection of multiple light and range unit (LIDAR)
CN105093925A (en) * 2015-07-15 2015-11-25 山东理工大学 Measured-landform-feature-based real-time adaptive adjusting method and apparatus for airborne laser radar parameters
US20170168146A1 (en) * 2015-12-15 2017-06-15 Uber Technologies, Inc. Dynamic lidar sensor controller
CN107687818A (en) * 2016-08-04 2018-02-13 纬创资通股份有限公司 Three-dimensional measurement method and three-dimensional measurement device
US20180074203A1 (en) * 2016-09-12 2018-03-15 Delphi Technologies, Inc. Lidar Object Detection System for Automated Vehicles
CN108010121A (en) * 2016-10-27 2018-05-08 莱卡地球系统公开股份有限公司 Method for processing scan data

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANG, Denggao et al.: "军事劳动卫生学" [Military Occupational Hygiene], vol. 1, 30 June 2001, Military Medical Science Press (军事医学科学出版社), pages 158-160 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220171031A1 (en) * 2020-08-24 2022-06-02 Innoviz Technologies Ltd. Multiple simultaneous laser beam emission and illumination while ensuring eye safety
US11726181B2 (en) * 2020-08-24 2023-08-15 Innoviz Technologies Ltd. Multiple simultaneous laser beam emission and illumination while ensuring eye safety

Also Published As

Publication number Publication date
WO2020018908A9 (en) 2020-10-22
US10627516B2 (en) 2020-04-21
JP2021532379A (en) 2021-11-25
EP3824312A4 (en) 2022-03-23
JP7108789B2 (en) 2022-07-28
US20200025923A1 (en) 2020-01-23
WO2020018908A1 (en) 2020-01-23
EP3824312A1 (en) 2021-05-26

Similar Documents

Publication Publication Date Title
JP7108789B2 (en) Adjustable pulse characteristics for ground detection in lidar systems
US11802946B2 (en) Method for dynamically controlling laser power
US11874401B2 (en) Adjusting receiver characteristics in view of weather conditions
CN110662981B (en) Time-varying gain of an optical detector operating in a lidar system
US11846707B2 (en) Ultrasonic vibrations on a window in a lidar system
US10401481B2 (en) Non-uniform beam power distribution for a laser operating in a vehicle
US10241198B2 (en) Lidar receiver calibration
US10061019B1 (en) Diffractive optical element in a lidar system to correct for backscan
US10191155B2 (en) Optical resolution in front of a vehicle
US10684360B2 (en) Protecting detector in a lidar system using off-axis illumination
US10663595B2 (en) Synchronized multiple sensor head system for a vehicle
US10295668B2 (en) Reducing the number of false detections in a lidar system
US10088559B1 (en) Controlling pulse timing to compensate for motor dynamics
US11181622B2 (en) Method for controlling peak and average power through laser receiver
US20180284234A1 (en) Foveated Imaging in a Lidar System
US10969488B2 (en) Dynamically scanning a field of regard using a limited number of output beams

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination