WO2024081230A1 - Lidar background noise detection and compensation - Google Patents

Lidar background noise detection and compensation

Info

Publication number
WO2024081230A1
Authority
WO
WIPO (PCT)
Prior art keywords
lidar
pulse
determining
background noise
time window
Application number
PCT/US2023/034809
Other languages
English (en)
Inventor
Shaminda Subasingha
Samantha Marie TING
Ryan MCMICHAEL
Noor Abdelmaksoud
Kai ZHOU
Mohammad Umar PIRACHA
Original Assignee
Zoox, Inc.
Application filed by Zoox, Inc.
Publication of WO2024081230A1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G01S17/10 Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G01S17/42 Simultaneous measurement of distance and other co-ordinates
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to group G01S17/00
    • G01S7/4802 Details using analysis of echo signal for target characterisation; target signature; target cross-section
    • G01S7/483 Details of pulse systems
    • G01S7/486 Receivers
    • G01S7/487 Extracting wanted echo signals, e.g. pulse detection

Definitions

  • lidar systems that use lasers to emit pulses into an environment and sensors to detect pulses that are reflected back from the surfaces of objects in the environment. Such reflected pulses may, in turn, be used to perform detection of objects, such as vehicles, pedestrians, and bicycles, in an environment.
  • Lidar sensors generally measure the distance from a lidar device to the surface of an object by transmitting a light pulse and receiving a reflection of the light pulse from the surface of the object, which may be read by a sensor of the lidar device. The sensor may generate a signal based on light pulses incident on the sensor.
  • Lidar return signals may be attributable to reflections of objects, but portions of lidar signals also may be attributable to noise and/or other interfering signals (e.g., from the lidar device itself or from an external source).
  • lidar systems may be used to detect objects in driving environments, analyze the objects, and/or determine routes for the vehicle to navigate through the environment safely and efficiently.
  • lidar noise and interference may cause errors in the analysis of lidar data, such as false-positive object detections.
  • Such lidar data analysis errors can present challenges to safely and efficiently navigating environments.
  • FIG. 1 is a pictorial flow diagram illustrating an example technique of determining and calibrating for lidar background noise, in accordance with one or more examples of the disclosure.
  • FIGS. 2 and 3 illustrate example environments including lidar background noise caused by solar radiation, and the corresponding effect on the lidar reflectivity data, in accordance with one or more examples of the disclosure.
  • FIG. 4 depicts an example graph representing the return signals associated with a number of lidar pulses emitted by a lidar system, in accordance with one or more examples of the disclosure.
  • FIGS. 5A and 5B depict example graphs representing a technique for determining lidar background noise associated with a lidar pulse, in accordance with one or more examples of the disclosure.
  • FIG. 6 depicts another example graph representing a technique for determining lidar background noise associated with a lidar pulse, in accordance with one or more examples of the disclosure.
  • FIG. 7 depicts an example graph representing techniques for determining power levels and/or corresponding times associated with a lidar pulse, in accordance with one or more examples of the disclosure.
  • FIGS. 8A and 8B depict example graphs representing a technique for determining background noise based on comparing ADC samples representing accumulated energy from different return lidar pulses having different transmit powers, in accordance with one or more examples of the disclosure.
  • FIG. 9 depicts an example environment including lidar background noise caused by solar radiation, and the corresponding calibrated lidar reflectivity data, in accordance with one or more examples of the disclosure.
  • FIG. 10 depicts an example environment including an example autonomous vehicle operating in the environment, in accordance with examples of the disclosure.
  • FIG. 11 is a block diagram of an example system for implementing various techniques described herein.
  • a lidar system may determine lidar background noise by performing a first energy data sampling (e.g., which may be performed using an analog-to-digital converter (ADC)). In at least some examples, such sampling may be performed during a time window associated with a particular lidar pulse.
  • the first ADC sampling may be performed over an estimated round-trip travel time associated with a laser pulse emitted by the lidar system.
  • the first ADC sample may be compared to a second ADC sample associated with a different time period (or multiple time periods), such as during a “dwell time” between lidar pulses.
  • Various techniques can be used to analyze and compare the data from the ADC samples, to determine a lidar background noise level associated with the environment.
  • the lidar reflectivity data output by the lidar system, including determinations and/or attributes of lidar points, may be modified to calibrate the data based on the background noise in the environment.
  • the lidar system itself may be reconfigured based on the background noise in the environment. For instance, the laser transmit power, aperture size, optical gain, and/or other features of the lidar system may be modified to calibrate out the background noise and/or improve the signal-to-noise ratio (SNR) of the lidar data.
  • the lidar system may emit periodic laser pulses (or lidar pulses) into the surrounding environment.
  • Lidar pulses emitted by the lidar system may include various properties, such as intensity, power, polarization, phase, coherence, spectral content, modulation, spatial shape, temporal shape, and other lidar pulse properties, some or all of which may be controlled by the lidar system.
  • the lidar system also may include lidar sensors (e.g., photodetectors) configured to detect returning lidar pulses that were reflected off of one or more surfaces and/or objects in the environment.
  • the systems and techniques described herein for determining and calibrating out lidar background noise can be implemented within autonomous vehicles.
  • an autonomous vehicle may use the lidar system to detect objects proximate to the vehicle in the environment (e.g., other vehicles, pedestrians, bicycles, road debris, traffic signs, etc.).
  • the techniques described herein can be applied to a variety of systems and/or platforms including lidar capabilities (e.g., sensor systems, robotic platforms, inspection systems, remote security systems, machine vision platforms, etc.) and are not limited to autonomous vehicles.
  • While the techniques described herein may apply to various types and configurations of lidar systems, certain techniques may provide particular advantages for lidar systems with high-sensitivity detectors and/or low-speed analog-to-digital converters (ADCs) used to receive and process reflectivity data.
  • photodetectors such as silicon photomultiplier (SiPM) detectors using single-photon avalanche diodes (SPAD), as well as other high-sensitivity photodetectors including photodiodes and/or avalanche photodiodes (APDs) may be highly sensitive to lidar background noise (e.g., light and/or energy reflections or emissions) within the environment.
  • the solar radiation emitted from sunlit surfaces may have significantly greater intensities than comparable shaded areas, causing lidar background noise within the sunlit surfaces.
  • the lidar background noise from solar radiation and/or other sources may present technical challenges for the lidar system in attempting to distinguish the background noise from the lidar pulse data reflected by surfaces and objects in the environment. These technical challenges are made more difficult when the lidar system uses a lower transmit power, allowing the background noise to obscure the lidar reflection data and decreasing the signal-to-noise ratio (SNR).
  • lidar systems that include a relatively low-speed ADC can present additional technical challenges for detecting lidar background noise.
  • an ADC may be used to generate the lidar sensor data output, based on the data received from the optical sensors.
  • photodetectors, photodiodes, and/or other optical sensors may generate analog signals that can be provided to the ADC to convert into a digital output signal for additional processing.
  • ADCs may use integrators and other similar techniques to accumulate the amount of light received via the optical sensors over a period of time.
  • the light data may be calculated (e.g., accumulated and/or integrated) over relatively longer time ranges, which can obscure the lidar background noise and cause difficulties in determining the precise peak power level and/or the peak time for a lidar return pulse signal.
  • the lidar system may be unable to distinguish how much of the accumulated light data is the result of the reflected lidar pulse and how much is the result of background noise.
  • Such low-speed ADCs may be advantageous over high-speed ADCs, however, such as with respect to cost, energy consumption, etc.
  • the lidar systems described herein may perform ADC sampling operations (e.g., accumulating and/or integrating the light data received from the optical sensors) over multiple time periods relative to the emission of a lidar pulse.
  • the lidar system may determine an estimated round-trip time for a lidar pulse (e.g., based on the range of the lidar system, the lidar pulse characteristics, the environment characteristics, etc.), and may determine a first time window during which the reflected light from the lidar pulse is likely to be received by the optical sensors.
  • the lidar system also may determine one or more additional time windows during the “dwell time” (e.g., time between the lidar pulses) during which reflected light from the lidar pulse is less likely to be received.
  • the lidar system may perform ADC sampling(s) during both time windows, and may compare the output from the first time window associated with the lidar pulse (e.g., which may include reflected light from the lidar pulse and lidar background noise) with the additional time window(s) during the dwell time (e.g., which may include primarily lidar background noise). By comparing the light energy accumulated during the different ADC samplings, the lidar system may determine the power level of the lidar background noise received from the environment, and may calibrate out the lidar background noise to more accurately determine the power level of the light reflected from the lidar pulse.
  • To determine the lidar background noise level, the ADC samples may be analyzed and/or compared using various different techniques.
  • the lidar system may use the ADC sample performed during the dwell time between lidar pulses to determine the background noise level at a particular location in the environment.
  • a rising edge meeting or surpassing a threshold may be used to initiate flagging of an ADC sample as being associated with a pulse, whereas a falling edge meeting the threshold may end flagging samples as associated with the pulse.
  • estimated transit time and firing time may be used (as will be discussed herein) for determining those samples associated with the lidar pulse and those associated with the dwell time.
  • the output from the ADC sample may represent the accumulated energy received by the optical sensors during the sampling time window.
  • the lidar system may use the accumulated energy output and the duration of the time window to determine the background noise level during the dwell time (e.g., by estimating an average irradiance or, otherwise, an amount of energy received per unit time).
  • the lidar system may perform multiple ADC samplings during the dwell time associated with a lidar pulse (e.g., just before or just after the end of the estimated round-trip time of the lidar pulse), and may average the output readings from the multiple ADC samples to estimate the background noise level.
  • the lidar system may subtract the background noise level from the ADC sample associated with the lidar pulse (e.g., the accumulated light received during the estimated round-trip time), to determine the peak power level and/or overall energy returned from the lidar return pulse signal.
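  • The comparison described above can be expressed in a short sketch. The following Python snippet is illustrative only: the function names, units, and numeric values are assumptions rather than part of the disclosure, and each sample is normalized by its own window duration so that windows of different lengths remain comparable.

```python
def estimate_background_power(dwell_samples_j, dwell_window_s):
    """Average background power (W) from ADC samples taken during the
    dwell time, when little or no pulse light is expected. Dividing by
    the window duration normalizes samples of different lengths."""
    mean_energy_j = sum(dwell_samples_j) / len(dwell_samples_j)
    return mean_energy_j / dwell_window_s

def calibrated_return_energy(pulse_sample_j, pulse_window_s, background_w):
    """Subtract the background energy accumulated over the pulse-time
    window from the pulse-time ADC sample."""
    return pulse_sample_j - background_w * pulse_window_s

# Illustrative values: three 500 ns dwell-time samples and one 334 ns
# pulse-time sample (energies in joules are invented for the example).
background_w = estimate_background_power([2.1e-12, 2.0e-12, 2.2e-12], 500e-9)
signal_j = calibrated_return_energy(9.5e-12, 334e-9, background_w)
```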
  • the lidar system may perform ADC samplings over uniform time intervals and/or time intervals having different durations.
  • when ADC samples are associated with different time durations, the output readings of the ADC samples may be adjusted (e.g., scaled) based on their time durations to provide consistent readings.
  • both the time duration associated with the ADC samples and the number of ADC samples that may be generated during the dwell time may be constrained by the characteristics of the ADC and the lidar system.
  • a lidar system may use an ADC with a sampling rate of 2 MHz, which is capable of outputting a maximum of one sample every 500 ns.
  • if the lidar system fires 355,000 laser pulses per second, one laser will be fired approximately every 2,812 ns. Assuming a range of 50 m for the lidar system, the estimated maximum round-trip time for each laser would be 334 ns. Therefore, in this example lidar system, there is 2,478 ns (2,812 ns - 334 ns) of dwell time (e.g., the time between the lidar return pulse signal and the firing of the next pulse) associated with each lidar pulse. During each dwell time, the ADC in this example is capable of outputting five samples.
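  • The timing budget in this example can be reproduced with a few lines of arithmetic. The sketch below simply recomputes the figures quoted above (2 MHz ADC, 355,000 firings per second, 50 m range); the exact quotient comes out just under five 500 ns samples per dwell time, which the example rounds to five.

```python
C_M_PER_S = 299_792_458.0          # speed of light

firing_rate_hz = 355_000           # laser firings per second
adc_period_ns = 500                # 2 MHz ADC -> one sample per 500 ns
max_range_m = 50.0

firing_interval_ns = 1e9 / firing_rate_hz           # ~2,817 ns; the text
                                                    # approximates 2,812 ns
round_trip_ns = 2 * max_range_m / C_M_PER_S * 1e9   # ~333.6 ns -> ~334 ns
dwell_ns = firing_interval_ns - round_trip_ns       # ~2,483 ns of dwell time
samples_per_dwell = dwell_ns / adc_period_ns        # ~4.97 -> about five
```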
  • the laser or detector may be turned off except for a shorter period of time and the integrated value of the ADC may be averaged with respect to the amount of time the detector is on (as opposed to the length of the ADC window).
  • time periods may be sampled immediately before a subsequent pulse to ensure another return isn't contributing to the estimated background.
  • the lidar system may use the lidar system capabilities and configuration data to predetermine the start time and duration of the ADC sampling window associated with a lidar pulse. For instance, when the firing time of a lidar pulse and the estimated range associated with that lidar pulse are known, the lidar system may compute the estimated (maximum) round-trip time for the lidar pulse and determine the ADC sampling window for the lidar pulse as the time window between the firing time and the estimated round-trip time. In other cases, the lidar system need not predetermine which of the ADC samplings is associated with a lidar pulse and which are associated with the dwell time between lidar pulses. For instance, the lidar system may perform a number of ADC samplings close together in time near a lidar pulse, and may compare the sizes of the output readings of the ADC samples to determine which are associated with the lidar pulse and which are associated with the dwell time.
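  • A minimal sketch of the predetermined-window case (function and parameter names are hypothetical): the window opens at the firing time and closes once the estimated maximum round-trip time has elapsed.

```python
def pulse_sampling_window(firing_time_ns: float, max_range_m: float):
    """Pulse-time ADC window derived from firing time and estimated
    range. Samples taken after the window closes and before the next
    firing fall in the dwell time."""
    c_m_per_ns = 0.299792458
    round_trip_ns = 2.0 * max_range_m / c_m_per_ns
    return firing_time_ns, firing_time_ns + round_trip_ns

start_ns, end_ns = pulse_sampling_window(0.0, 50.0)   # -> (0.0, ~333.6)
```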
  • Some lidar systems may use time-to-digital converters (TDCs) instead of, or in addition to, ADCs to evaluate the data received from the optical sensors and output digital sensor data.
  • lidar systems may use TDCs to implement power level thresholds based on the analog light data received from the optical sensors. When a power level threshold is triggered, the TDC may output the corresponding timing information.
  • the lidar system may use a combination of TDC thresholds to determine the start and end times of a lidar return pulse signal, and/or the peak power level or peak time associated with a lidar pulse.
  • the lidar system may initiate ADC samplings based on the determined times to determine the peak power level of the return signal. Additional TDC thresholds may be used to determine the dwell time associated with the lidar pulse, and the lidar system may initiate ADC samplings within the determined dwell time to determine the background noise power level.
  • the lidar system may use the background noise level to modify/calibrate the lidar reflectivity data associated with the location.
  • the lidar system may use the reflectivity data associated with a lidar pulse to determine whether or not to output a lidar point based on the pulse, and to determine the attributes of the lidar point.
  • the lidar system may reduce false positive and false negative lidar point determinations, and may provide more accurate power level data associated with the determined lidar points.
  • the calibration of the lidar reflectivity data based on the background noise also may improve the signal-to-noise ratio (SNR), by reducing some or all of the background noise in the reflectivity data.
  • the lidar system also may be reconfigured based on the determined levels of background noise in the environment. For instance, the laser transmit power, aperture size, optical gain, and/or other features of the lidar system may be modified to reduce background noise and/or improve the signal-to-noise ratio (SNR) of the lidar data. Any of these characteristics of the lidar system may be modified, individually or in any combination, in response to the level of background noise in the environment.
  • the lidar system may increase transmit power to better distinguish reflectivity lidar data from background noise. Additionally or alternatively, the lidar system may reduce the aperture size in high background noise regions. In contrast, in regions with lower levels of lidar background noise, the lidar system may decrease transmit power and/or increase the aperture size to save energy and/or improve the SNR of the lidar data.
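  • A toy policy illustrating this kind of reconfiguration is sketched below; the thresholds, configuration keys, and scaling factors are invented for the example and are not taken from the disclosure.

```python
def adjust_lidar_config(background_power_w: float, config: dict) -> dict:
    """Raise transmit power and shrink the aperture in high-noise
    regions; relax both in low-noise regions to save energy.
    HIGH_W/LOW_W thresholds and config keys are hypothetical."""
    HIGH_W, LOW_W = 1e-9, 1e-11
    if background_power_w > HIGH_W:
        config["transmit_power"] *= 1.5
        config["aperture_scale"] *= 0.8
    elif background_power_w < LOW_W:
        config["transmit_power"] *= 0.8
        config["aperture_scale"] *= 1.2
    return config
```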
  • the lidar system may compute a different background noise level for each different laser and/or each laser pulse emitted by the system.
  • the lidar system may use these techniques over multiple laser pulses emitted toward the same location to determine the average background noise of the location. For instance, when the lidar system is stationary (e.g., mounted to a fixed location in a security system), it may determine the average background noise of a location based on multiple laser pulses emitted from the same laser and/or in the same (or a substantially similar) direction. When the lidar system itself is not stationary (e.g., a lidar on an autonomous vehicle), it may determine the estimated distance between the target locations of different lidar pulses (e.g., based on the movement of the autonomous vehicle between the laser emissions, the lidar range, etc.).
  • the lidar system then may use a distance threshold, time threshold, and/or threshold number of pulses to determine when multiple pulses are targeted to the same region and are close enough to be used to determine the average background noise of the region.
  • lidar returns may be selected (or otherwise determined) which are associated with a same or similar target region despite movement of the sensor (e.g., rotations) and/or system to which it is mounted (e.g., the vehicle).
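  • One way to implement this grouping is sketched below, assuming each pulse record carries an estimated target position, a timestamp, and a per-pulse noise estimate (all field names and threshold values are illustrative assumptions).

```python
import math

def group_pulses_by_region(pulses, dist_threshold_m=0.5, time_threshold_s=0.2):
    """Group pulse records whose target locations and timestamps are
    close enough to share a single background-noise estimate."""
    groups = []
    for p in pulses:
        for group in groups:
            ref = group[0]
            near = math.dist((p["x"], p["y"]), (ref["x"], ref["y"])) < dist_threshold_m
            recent = abs(p["t"] - ref["t"]) < time_threshold_s
            if near and recent:
                group.append(p)
                break
        else:
            groups.append([p])
    return groups

def average_background(group):
    """Average the per-pulse noise estimates within one region."""
    return sum(p["noise"] for p in group) / len(group)
```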
  • the various systems and techniques described herein may be directed to determining and calibrating out lidar background noise to improve the quality of lidar data used to generate surface detection data, object detection data, and/or other data that may be used by a vehicle, such as an autonomous vehicle, to more accurately identify objects in an environment. Using this improved data, such a vehicle may generate safer and more efficient trajectories for use in navigating through an environment.
  • the systems and techniques described herein may also, or instead, enable a vehicle, such as an autonomous vehicle, to more accurately predict trajectories of other vehicles and/or mobile objects in an environment and therefore operate more safely in the environment using such predictions.
  • the systems and techniques described herein can utilize data structures containing surface detection data and/or object detection data based on the disclosed improved analysis of returned lidar pulses to more accurately and efficiently determine the locations of objects in an environment and the proximity of an autonomous vehicle to such objects.
  • the examples described herein may result in increased certainty and accuracy of object detections, thereby allowing an autonomous vehicle to generate more accurate and/or safer trajectories for the autonomous vehicle to traverse in the environment.
  • techniques described herein may increase the reliability of the determination of locations, dimensions, and/or other physical parameters of objects in the environment, reducing the likelihood of failing to detect or inaccurately detecting an object. That is, techniques described herein provide a technological improvement over existing object detection, classification, tracking, and/or navigation technology. In addition to improving the accuracy of object detections and determinations of the size, shape, and location of such objects, the systems and techniques described herein can provide a smoother ride and improve safety outcomes by, for example, more accurately providing safe passage to an intended destination through an environment that is also occupied by one or more objects.
  • The techniques described herein may also improve the operation of computing systems and increase resource utilization efficiency.
  • vehicle computing systems may perform object detection more efficiently using the techniques described herein, because the disclosed examples for calibrating out background noise may improve the signal-to-noise ratio (SNR) of the lidar data.
  • these techniques may permit the lidar system to use lower transmit powers for lidar pulses, and/or may require the processing of fewer returned lidar pulses and/or associated data than would be required using conventional techniques.
  • the systems and techniques described herein can be implemented in several ways. Example implementations are provided below with reference to the following figures. Although discussed in the context of an autonomous vehicle, the techniques described herein can be applied to a variety of systems (e.g., a sensor system or a robotic platform) and are not limited to autonomous vehicles. For example, the techniques described herein may be applied to semi-autonomous and/or manually operated vehicles. In another example, the techniques can be utilized in an aviation or nautical context, or in any system involving objects or entities having dimensions and/or other physical parameters that may not be known to the system.
  • the disclosed systems and techniques may include processing using various types of components and various types of data and data structures, including, but not limited to, various types of image data or sensor data (e.g., stereo cameras, time-of-flight data, radar data, sonar data, and the like). Additionally, the techniques described herein can be used with real data (e.g., captured using sensor(s)), simulated data (e.g., generated by a simulator), or any combination of the two.
  • FIG. 1 is a pictorial flow diagram of an example process 100 for determining and calibrating out background noise within reflected lidar pulses in an environment.
  • one or more operations of the process 100 may be implemented by a lidar system 102.
  • the lidar system 102 may be implemented within a vehicle computing system, such as by using one or more of the components and systems illustrated in FIG. 10 and described below.
  • one or more components and systems can include those associated with the one or more sensor systems 1010, the perception component 1022, and/or the planning component 1024 illustrated in FIG. 10, and/or one or more components and systems can include those associated with the one or more sensor systems 1106 and/or the perception component 1122 of the vehicle 1102 illustrated in FIG. 11.
  • the one or more operations of the process 100 may also, or instead, be performed by a remote system in communication with a vehicle, such as the perception component 1140 of the computing device(s) 1134 illustrated in FIG. 11. Such processes may also, in turn, be performed by the device itself (e.g., using onboard electronics) such that a standalone device may produce such signals without the need for additional computational resources.
  • the one or more operations of the process 100 may be performed by a combination of a remote system and a vehicle computing system.
  • the process 100 is not limited to being performed by such components and systems, and the components and systems of FIGS. 10 and 11 are not limited to performing the process 100.
  • a lidar system 102 which may be associated with an autonomous vehicle or may be implemented in a different system/environment, may emit a signal into an environment.
  • the lidar system 102 may include one or more lidar emitters and one or more lidar sensors (e.g., photodetectors).
  • the environment into which the lidar pulse is emitted may include one or more other objects that a vehicle computing system configured at the autonomous vehicle may detect.
  • the lidar system 102 may be implemented within a vehicle computing system of an autonomous vehicle that includes one or more sensors configured to detect stationary objects (e.g., buildings, road markings, signs) and moving objects (e.g., people, bicycles, other vehicles) in the environment.
  • the lidar system 102 may be implemented within an autonomous vehicle traversing a driving environment.
  • the lidar system 102 emits periodic lidar pulses in different directions from the vehicle, receives return signals from the lidar pulses, and analyzes the return signals to detect various objects 108-112 in the environment.
  • While the lidar system 102 may emit any number of lidar pulses (e.g., based on the number of lasers and firing rates) at various angles/directions into the environment, the operations below may refer to receiving and analyzing the return signal from a single lidar pulse.
  • the lidar system 102 performs a first ADC sampling during a first time window associated with a lidar pulse.
  • the lidar system 102 may determine the first time window as the period of time during which a significant amount of the reflected light from the laser pulse will be received by the lidar sensors.
  • a first sampling time window 118 corresponds to the time during which most (or all) of the return signal from a lidar pulse is captured by the lidar sensors.
  • the lidar system 102 may determine an estimated maximum round-trip time associated with the lidar pulse emitted in operation 104.
  • Different lidar pulses emitted by the same lidar system may have different estimated maximum round-trip times, based on the characteristics of the laser pulse (e.g., intensity, frequency, etc.) and the characteristics of the environment (e.g., distance to a surface). For instance, if the likely maximum range of the emitted pulse is 50 meters, then the estimated maximum round-trip time for the pulse would be 334 ns (e.g., 50 m * 2 / c, where c equals the speed of light).
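  • Written as an equation, the round-trip estimate used in this example is:

```latex
t_{\mathrm{round\text{-}trip}} = \frac{2d}{c}
  = \frac{2 \times 50\ \mathrm{m}}{2.998 \times 10^{8}\ \mathrm{m/s}}
  \approx 333.6\ \mathrm{ns} \approx 334\ \mathrm{ns}
```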
  • the lidar system 102 may determine a start time of the time window corresponding to the start time of the lidar pulse (or shortly thereafter) and the end time of the time window corresponding to the start time plus the estimated maximum round-trip time for the pulse.
  • the lidar system 102 may perform an ADC sampling by computing the accumulated light received by the lidar sensors (e.g., photodetectors) associated with the lidar pulse, over the sampling time window.
  • the ADC may use an integrator to determine the accumulated light received during the first time window, and may output the accumulated light data (e.g., as a power or energy reading) as the ADC sample for the first time window.
  • the lidar system 102 performs a second ADC sampling during a second time window that is different from the first time window.
  • the lidar system 102 may determine the second time window as the period of time during which little or none of the reflected light from the laser pulse will be received by the lidar sensors.
  • a second sampling time window 124 corresponds to the time after the first time window during which little (or none) of the return signal from the lidar pulse is captured by the lidar sensors.
  • the lidar system 102 may determine the second time window as a period of time after the first time window, during the “dwell time” between two lidar pulses.
  • the dwell time associated with a lidar pulse may be the time period between the return signal of the lidar pulse (e.g., after the estimated maximum round-trip time) and the firing of the next lidar pulse.
  • the second time window in operation 120 may be any period of time during the dwell time associated with a lidar pulse.
  • the second time window may be the same length as the first time window, which may simplify the comparison of the ADC samplings. In other cases, the second time window may be longer or shorter than the first time window.
  • the lidar system 102 may perform multiple additional ADC samplings during the dwell time of a lidar pulse in operation 120, and may average the output readings from the multiple ADC samples to estimate the background noise level.
  • the lidar system 102 may use similar or identical techniques to those used to perform the ADC sampling in operation 114.
  • the ADC within the lidar system 102 may use an integrator to accumulate the light received by the lidar sensors (e.g., photodetectors) over the second time window, and may output the accumulated light data as the power or energy reading for the second time window.
  • the lidar system 102 may determine the level of lidar background noise based on the first ADC sampling performed in operation 114 and the second ADC sampling performed in operation 120. The lidar system 102 then may calibrate out the determined background noise from the lidar reflectivity data. For example, based on the second ADC sample (and/or additional ADC samples captured during the dwell time associated with a lidar pulse), the lidar system 102 may determine an average background noise level. The lidar system 102 then may subtract the average background noise level (e.g., an energy or power value) from the first ADC sample to determine the reflectivity data associated with the lidar pulse.
  • the resulting lidar reflectivity data may provide improved and more accurate signal characteristics, and an increased signal-to-noise ratio (SNR) for the lidar data.
  • Box 128 in this example represents the improved and more accurate lidar data for an environment, after calibrating out the lidar background noise.
  • the operations described in FIG. 1 may be performed for a single lidar pulse, to determine the level of lidar background noise at the location where the pulse was emitted.
  • groups of multiple lidar pulses emitted to the same location/region of the environment may be analyzed together to determine an average background noise level for the region.
  • different locations/regions in the environment may have different levels of background noise (e.g., shaded versus sunlit regions, regions having different surface materials, etc.).
  • the level of the background noise for a particular location/region may change in a relatively short period of time, for instance, when a sunlit region becomes shaded or vice versa.
  • the operations described in FIG. 1 may be performed separately for each different location in the environment, and/or may be performed periodically in the same region to determine updated background noise levels.
  • although this example depicts performing the “pulse time” ADC sampling during the first sampling window 118 before the “dwell time” ADC sampling during the second sampling window 124, in other examples the dwell time ADC sampling can be performed prior to the pulse time ADC sampling.
  • techniques described herein may use one or more dwell time ADC sampling windows to measure lidar background noise, without necessarily performing any pulse time ADC sampling.
  • the lidar data depicted in FIGS. 2 and 3 may represent lidar data generated without using the various techniques described herein for determining and calibrating out lidar background noise.
  • the lidar data shown in FIGS. 2 and 3 may include a number of inaccuracies (e.g., false positives and/or false negative lidar detections), caused by the regions of solar radiation in the environment.
  • FIG. 2 depicts an example driving environment 200 associated with an autonomous vehicle (or other sensor system).
  • image 202 represents a visual image captured by a camera associated with the autonomous vehicle while traversing the driving environment.
  • Region 204 within the image 202 represents a shaded region caused by the shading from one or more buildings or trees in the environment.
  • Lidar data 206 shows a top-down rendering of the lidar data generated by a lidar system based on the driving environment 200.
  • Region 208 within the lidar data 206 corresponds to region 204 within the image 202.
  • the colors purple and dark blue indicate lower reflectivity, while red and yellow indicate higher reflectivity.
  • the lidar data region 208 shows a lower level of reflectivity than other similar road surfaces in the environment 200.
  • This example illustrates that the shaded areas (e.g., region 204) degrade the measured reflectivity of the road surface.
  • FIG. 3 depicts another example driving environment 300 associated with an autonomous vehicle (or other sensor system).
  • image 302 represents a visual image captured by a camera associated with the autonomous vehicle while traversing the driving environment 300.
  • Region 304 within the image 302 represents a patch of sunlight passing between two buildings within an otherwise shaded area of the street.
  • Lidar data 306 shows a top-down rendering of the lidar data generated by a lidar system based on the driving environment 300.
  • Region 308 within the lidar data 306 corresponds to region 304 within the image 302.
  • the patch of sunlight in region 304 causes a difference in reflectivity compared to the surrounding shaded areas.
  • both the examples in FIG. 2 and FIG. 3 illustrate that, when the background noise due to solar radiation is not calibrated out, the presence of various shaded and sunlit areas in the environment may cause differences in reflectivity which can degrade or distort the measured reflectivity of the surfaces in the environment.
  • background noise from solar radiation may result in false positive lidar point detections (e.g., region 308 of FIG. 3) and/or false negatives where lidar points based on reflectivity are not detected (e.g., in region 208 of FIG. 2).
  • FIG. 4 depicts an example graph 400 representing the return signals (or lidar return pulses) associated with two lidar pulses (which also may be referred to as laser firings) emitted by a lidar system 102.
  • graph 400 depicts a first lidar pulse 402 and a second lidar pulse 404 that have been fired in a lidar system 102, as well as the subsequent first return signal 406 associated with the first lidar pulse and the second return signal 408 associated with the second lidar pulse.
  • the first and second lidar pulses in this example may represent laser pulses emitted from two different lasers of the lidar system 102, or may represent consecutive pulses emitted by the same laser at different times.
  • the lidar sensors may receive a lidar return pulse 406 associated with the first laser firing.
  • a laser return signal (which also may be referred to as the lidar return pulse) may generally correspond to the period of time during which most (or all) of the reflected light from the lidar pulse is received by the lidar sensors.
  • the laser return signal may correspond to a period of time starting at the firing of the laser and ending after the estimated maximum round-trip time of the lidar pulse.
  • the lidar system 102 may estimate the maximum round-trip time of a lidar pulse based on a number of factors, such as the transmit power of the lidar pulse, the wavelength of the laser, and/or the characteristics of the location to which the laser is directed.
  • the lidar sensors may experience a period of “dwell time” before the next lidar pulse 404 is fired.
  • during the dwell time associated with a lidar pulse, little (or none) of the reflected light from the lidar pulse may be received by the lidar sensors.
  • the lidar sensors may receive light from various background noise light sources (e.g., reflected sunlight, object surfaces emitting heat, etc.).
  • the dwell time between the lidar pulses indicates a relatively consistent level of background noise (e.g., a power level greater than zero) which may be caused by solar radiation at the location where the laser is directed.
  • FIGS. 5A and 5B depict two example graphs 500 and 502 illustrating a technique for determining the lidar background noise level associated with a lidar pulse 504.
  • the lidar system 102 may determine a first time window 508 during the dwell time in between lidar pulses, and may perform a first ADC sampling during the first time window 508.
  • the time window 508 in this example may be selected as a time window just prior to the firing of the lidar pulse 504.
  • the ADC of the lidar system 102 may determine the accumulated amount of light (e.g., using an integrator) received by the analog lidar sensor elements (e.g., photodetectors) over the sampling time window.
  • Graph 500 depicts the first ADC sampling window 508, including shading indicating the accumulated light data received during the first sampling window.
  • the lidar sensors may receive none of the reflected light from the lidar pulse 504 (and none or a minimal amount of reflected light from any pulses previous to lidar pulse 504).
  • lidar system 102 may perform an additional ADC sampling during a second ADC sampling window 510.
  • the second ADC sampling window 510 is not associated with the dwell time before lidar pulse 504. Rather, the second ADC sampling window 510 may begin simultaneously with or shortly after the beginning of the lidar pulse 504.
  • the lidar system 102 may determine the end of the second ADC sampling time window 510 based on the estimated maximum round-trip time for the laser in the first lidar pulse 504.
  • the duration of the sampling time window 510 may be set for approximately 200 nanoseconds based on determining that most or all of the reflected light from the lidar pulse should return to the lidar sensor within that time duration.
  • the lidar system 102 may use longer sampling time windows (e.g., 300 nanosecond duration), and so on.
  • the lidar system 102 may determine “pulse time” ADC sampling time windows to correspond to the time duration when most or all of the return signal from the lidar pulse is likely to be received by the lidar sensors, and separate “dwell time” ADC sampling time windows to correspond to periods between lidar pulses, during which the lidar sensors may receive relatively little (or none) of the reflected light from the lidar pulse. For instance, in addition to the first ADC sampling time window 508, the lidar system 102 may determine additional “dwell time” sampling time windows using any time period between the end of the second ADC sampling time window 510 and the start of the next lidar pulse 506.
  • any ADC sampling window between the end of the second ADC sampling time window 510 and the start of the next lidar pulse 506 may be used as a “dwell time” sampling window, during which the lidar sensors may receive light from background noise sources (e.g., solar radiation) but little or no light from previous reflected lidar pulses.
  • the lidar system 102 may use the accumulated light data from the ADC samples of the first time window 508 and the second time window 510 to compute the background noise level associated with the lidar pulse 504. For example, lidar system 102 may use the duration of the first ADC sampling time window 508, and the accumulated light received during the first ADC sample (e.g., the integral of the shaded region in graph 502), to determine an average background noise level. The lidar system 102 may then subtract the average background noise level determined from the first ADC sampling from the second ADC sampling shown in FIG. 5B to improve the accuracy of the reflectivity data for the lidar return pulse.
  • the total dwell time between lidar pulses 504 and 506 may be longer than the duration of a “dwell time” sampling window (e.g., the first ADC sampling time window 508).
  • the lidar system 102 may determine a duration for dwell time ADC sampling windows that is longer than the duration of “pulse time” ADC sampling windows.
  • the ADC may integrate to determine the accumulated amount of light over the longer time period, and then divide by the longer time duration to determine the average background noise level.
  • the “dwell time” ADC sampling windows may be shorter in duration than the “pulse time” ADC sampling windows, in which case the lidar system may integrate to determine the accumulated amount of light over the shorter dwell time period, and then divide by the shorter time duration to determine the average background noise level.
  • the lidar system 102 may perform multiple ADC samplings during the dwell time associated with the lidar pulse 504.
  • the multiple ADC samplings may be averaged (or otherwise combined) to determine the background noise level to be calibrated out of the lidar return pulse.
  • an example graph 600 is shown illustrating another technique for determining the background noise associated with a lidar pulse.
  • the example lidar pulse(s) shown in FIG. 6 may be similar or identical to the lidar pulse(s) shown in FIGS. 5 A and 5B.
  • the lidar system 102 has determined, based on the amount of dwell time between the end of the lidar return pulse and the next laser firing, that five ADC samples may be captured and evaluated during the dwell time.
  • the ADC sample captured during the lidar return pulse (e.g., ADC Sample 1) and the ADC samples captured during the dwell time (e.g., ADC Sample 1a through ADC Sample 1e) may have different time window durations.
  • the lidar system 102 may determine the background noise based on the median of ADC Sample 1a through ADC Sample 1e.
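  • A median is a natural choice here because it is robust to a single dwell-time sample that happens to catch stray light. A minimal sketch follows; the sample values are invented for the example.

```python
from statistics import median

# Five dwell-time readings, as in FIG. 6; ADC Sample 1e is skewed by a
# stray return, but the median ignores the outlier.
dwell_samples = {"1a": 2.1e-12, "1b": 2.0e-12, "1c": 2.3e-12,
                 "1d": 2.2e-12, "1e": 7.9e-12}
background = median(dwell_samples.values())   # -> 2.2e-12
```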
  • FIG. 7 depicts an example graph 700 illustrating various techniques for determining particular power levels and/or corresponding particular times associated with a lidar pulse.
  • certain lidar systems may include time-to-digital converters (TDCs) configured to evaluate the power levels associated with lidar return pulses.
  • a lidar system 102 may include TDCs instead of, or in addition to, ADCs.
  • TDCs may receive data from the optical sensors of the lidar system 102, and output digital sensor data representing the amount of light data received by the optical sensors.
  • TDCs may be used to implement power level thresholds based on the amount of analog light data received via the optical sensors.
  • graph 700 depicts two separate TDC power thresholds 702 and 704.
  • the TDC may output an indication of which TDC power threshold was crossed and the time at which the threshold was crossed.
  • Graph 700 depicts a return signal (a return lidar pulse) caused by a lidar pulse, followed by a dwell time associated with the lidar pulse.
  • the TDC may output a first measurement indicating time t1 in response to the lidar return pulse signal crossing the first TDC threshold 702, a second measurement indicating time t2 in response to the lidar return pulse signal crossing the second TDC threshold 704, a third measurement indicating time t3 in response to the lidar return pulse signal crossing the second TDC threshold 704 again, and a fourth measurement indicating time t4 in response to the lidar return pulse signal crossing the first TDC threshold 702 again.
  • the lidar system 102 may use TDC power thresholds to determine any or all of the ADC sampling time windows described herein. For instance, in response to the lidar return pulse signal exceeding the first threshold 702 at time t1, and/or in response to the lidar return pulse signal exceeding the second threshold 704 at time t2, the lidar system 102 may initiate one or more ADC sampling windows to measure the reflectivity associated with the lidar return pulse. Additionally or alternatively, in response to the lidar return pulse signal falling below the second threshold 704 at time t3 and/or falling below the first threshold 702 at time t4, the lidar system 102 may end ADC sampling time windows associated with the lidar return pulse.
  • the lidar system 102 may use the TDC power thresholds to determine when the dwell time associated with a lidar pulse has begun (e.g., in response to the lidar return pulse signal falling below the first threshold 702 at time t4), and may use the outputs of the TDC to trigger an ADC sampling to measure the light received during a time window within the dwell time.
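  • The t1..t4 crossings described for graph 700 can be turned into sampling-window boundaries with straightforward bookkeeping; the tuple layout and field names below are assumptions for illustration.

```python
def pulse_window_from_tdc(crossings):
    """crossings: (time_ns, threshold_id, direction) tuples reported by
    the TDC. The first rising crossing (t1) opens the pulse window and
    the last falling crossing (t4) closes it; dwell-time ADC sampling
    can be triggered after the window closes."""
    rising = [t for t, _, d in crossings if d == "rising"]
    falling = [t for t, _, d in crossings if d == "falling"]
    return min(rising), max(falling)

# t1=10 ns (702), t2=14 ns (704), t3=22 ns (704), t4=26 ns (702)
start_ns, end_ns = pulse_window_from_tdc(
    [(10, 702, "rising"), (14, 704, "rising"),
     (22, 704, "falling"), (26, 702, "falling")])
```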
  • TDC power thresholds may be used to determine a maximum power level (e.g., peak 706) associated with a return lidar pulse.
  • the lidar system 102 may subtract the background noise level from peak 706 of the lidar pulse to determine a calibrated peak value that represents the peak reflectivity associated with the lidar pulse (e.g., excluding the background noise).
  • FIGS. 8 A and 8B show two example graphs 800 and 802 illustrating additional techniques for determining the level of background noise associated with one or more lidar pulses.
  • FIG. 8A shows a first graph 800 depicting a first lidar pulse 804 emitted by a laser in a lidar system 102, as well as the lidar return pulse signal associated with the first lidar pulse.
  • FIG. 8B shows a second graph 802 depicting a second lidar pulse 806 emitted by a different laser (or the same laser) at a different time, and the lidar return pulse signal associated with the second lidar pulse.
  • the lidar system 102 also may perform an ADC sampling associated with each lidar pulse.
  • the lidar system 102 may use larger time windows for the ADC samplings, corresponding to the entire time period between consecutive lidar pulses.
  • shaded area 810 may represent the accumulated amount of light (e.g., determined using an integrator) detected by the optical sensors of the lidar system 102 between time t1 (the firing of the first lidar pulse 804) and time t2 (the firing of the second lidar pulse 806).
  • the shaded area 812 may represent the accumulated amount of light detected by the optical sensors of the lidar system 102 between time t2 (the firing of the second lidar pulse 806) and time t3 (the firing of the third lidar pulse 808).
  • the first lidar pulse 804 and second lidar pulse 806 may be directed to the same location or a nearby location within the same region of the environment. Although they are depicted as consecutive pulses in this example, in other examples they may be non-consecutive.
  • the lidar system 102 may vary the transmit power between the first lidar pulse 804 and second lidar pulse 806, and then analyze and compare the ADC samples associated with the lidar pulses to determine the background noise level. As shown in this example, the transmit power of the first lidar pulse in FIG. 8A is lower than the transmit power of the second lidar pulse in FIG. 8B.
  • the size of the ADC sample (e.g., shaded region 810) associated with the first lidar pulse 804 may be smaller than the size of the ADC sample (e.g., shaded region 812) associated with the second lidar pulse 806. Because the lidar pulses are directed to the same location/region at nearly the same time, the lidar system 102 may assume the same background noise level associated with the ADC samples 810 and 812. Additionally, the lidar system 102 may assume that the difference in the reflectivity of the lidar pulses 804 and 806 may be proportional to (or otherwise correlated with) the difference in the transmit power. Based on these assumptions, the lidar system 102 can compute the level of background noise associated with the lidar pulses, based on the ADC samples 810 and 812.
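  • Under the stated assumptions, the two ADC samples give two linear equations that can be solved for the background. Writing S_i = k * P_i + N, where P_i is the transmit power, k captures the (shared) surface reflectivity, and N is the background energy common to both windows, a sketch of the solution (model and names are illustrative, not the disclosed method verbatim) is:

```python
def background_from_two_powers(s1, p1, s2, p2):
    """Solve S1 = k*P1 + N and S2 = k*P2 + N for the background N,
    assuming the reflected energy scales with transmit power and the
    background is the same for both (co-located, near-simultaneous)
    samples."""
    k = (s2 - s1) / (p2 - p1)     # reflectivity slope
    return s1 - k * p1            # background energy N

# e.g., S1 = 5.0 at P1 = 1.0 and S2 = 8.0 at P2 = 2.0 -> k = 3.0, N = 2.0
```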
  • an additional or alternative technique may be used to derive the background irradiance based on similar location data.
  • pulses may be chosen in which no surface reflection is expected (e.g., as may be derived from using map data, a current field of view of the lidar sensor, and/or position/orientation of the sensor in the environment).
  • lidar channels associated with no return may be selected which are contemporaneous (or substantially - e.g., within some threshold amount of time) and/or proximate to those that do.
  • the relative difference (or other combination) of integrated received data may be used for calibration in those instances.
  • the laser may randomly (or with some pattern) refrain from firing.
  • the light received by the associated detector during the skipped firing may then be attributed to background flux.
  • the background may be stored and subtracted from those subsequent lidar points pointing at a same target area (e.g., as would have been observed by the original laser if it had been fired).
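  • A sketch of this bookkeeping, assuming returns can be binned by firing direction (the map structure and names are invented for the example):

```python
background_by_direction: dict = {}   # keyed by (azimuth_bin, elevation_bin)

def record_skipped_firing(direction, detector_energy_j):
    """When a firing is withheld, the detector reading for that
    direction is pure background flux; store it for later use."""
    background_by_direction[direction] = detector_energy_j

def correct_return(direction, measured_energy_j):
    """Subtract the stored background from later returns aimed at the
    same target area, when an estimate exists."""
    return measured_energy_j - background_by_direction.get(direction, 0.0)
```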
  • a dedicated sensor may be used to measure the background noise in addition to the sensor associated with the lidar system.
  • one or more such additional sensors may be directed to the same or similar target areas as those associated with the lidar.
  • FIG. 9 depicts another example driving environment 900 associated with an autonomous vehicle or other sensor system.
  • image 902 is depicted showing a visual image captured of the environment.
  • Regions 904 within the image 902 represent shaded regions on the road surface, while regions 906 represent unshaded regions on the same road surface. As described above, shaded and unshaded areas on the same surface may have different levels of lidar background noise.
  • the lidar system 102 may determine the background noise levels associated with the shaded regions 904, and may separately determine the background noise levels of the unshaded regions 906 on the road surface (as well as determining the lidar background noise levels for any other regions in the environment).
  • After determining the background noise levels for the shaded areas and unshaded areas in the environment, the lidar system 102 may calibrate the lidar data to exclude the differences in background noise between areas on the same surface.
  • Lidar data 908 represents a rendering of the lidar data points generated by a lidar system 102 based on the example driving environment 900.
  • the differences in lidar background noise between the shaded regions 904 and unshaded regions 906 of the road surface have been calibrated out (e.g., by subtracting the respective background noise levels from the reflectivity data for each region) to provide calibrated lidar data output.
  • the calibrated lidar data 908 does not include regions of degraded measured reflectivity (as shown in lidar data 206) or differences in reflectivity of the same surface based on solar radiation (as shown in lidar data 306).
  • FIG. 10 illustrates a perspective view 1000 of an environment in which a vehicle 1002 may be traveling.
  • the environment contains a number of objects, such as another vehicle 1004, a traffic signal 1006, and a tree 1008.
  • a vehicle computing system operating the vehicle 1002 may use one or more sensor systems 1010 to emit signals, receive reflected signals, and process such reflected signals to generate data that may be used in object detection operations.
  • the vehicle 1002 may be configured with one or more lidar sensor systems 1012.
  • the vehicle computing system may operate the lidar system 1012 to emit lidar pulses into the environment. These lidar pulses may be reflected, directly or indirectly, from the surfaces of the various objects back to the lidar system 1012 as return signals (or return pulses), which may be measured in the various ways described herein to determine and calibrate out the background lidar noise within the environment.
  • the processor(s) 1014 and memory 1016 of the sensor system 1010 may process return signals (e.g., return lidar pulses and/or background noise) as described herein to determine background lidar noise levels and/or to calibrate the lidar data (e.g., lidar points and attributes thereof) based on the determined background noise.
  • the processor 1014 may generate data that may be provided to the perception component 1022 of the vehicle control system 1020 performing object detection operations.
  • the perception component 1022 may then provide surface detection data and/or object detection data to the planning component 1024 for trajectory and route planning for the vehicle 1002.
  • the system 1100 can include a vehicle 1102.
  • the vehicle 1102 can include a vehicle computing device 1104 that may function as and/or perform the functions of a vehicle controller for the vehicle 1102.
  • the vehicle 1102 can also include one or more sensor systems 1106, one or more emitters 1108, one or more communication connections 1110, at least one direct connection 1112, and one or more drive systems 1114.
  • the vehicle computing device 1104 can include one or more processors 1116 and memory 1118 communicatively coupled with the one or more processors 1116.
  • the vehicle 1102 is an autonomous vehicle; however, the vehicle 1102 could be any other type of vehicle.
  • the memory 1118 of the vehicle computing device 1104 stores a localization component 1120, a perception component 1122, a planning component 1124, one or more system controllers 1126, one or more maps 1128, and a prediction component 1130. Though depicted in FIG. 11 as residing in the memory 1118, any one or more of the localization component 1120, the perception component 1122, the planning component 1124, the one or more system controllers 1126, the one or more maps 1128, and the prediction component 1130 can additionally, or alternatively, be accessible to the vehicle 1102 (e.g., stored remotely).
  • the localization component 1120 can include functionality to receive data from the sensor system(s) 1106 to determine a position and/or orientation of the vehicle 1102 (e.g., one or more of an x-, y-, z-position, roll, pitch, or yaw).
  • the localization component 1120 can include and/or request/receive a map of an environment and can continuously determine a location and/or orientation of the autonomous vehicle within the map.
  • the localization component 1120 can utilize SLAM (simultaneous localization and mapping), CLAMS (calibration, localization and mapping, simultaneously), relative SLAM, bundle adjustment, non-linear least squares optimization, or the like, applied to received sensor data (e.g., image data), to determine a location and/or orientation of the vehicle.
  • the localization component 1120 can provide data to various components of the vehicle 1102 to determine an initial position of an autonomous vehicle for generating a trajectory and/or for generating map data, as discussed herein.
  • the perception component 1122 can include functionality to perform object detection, segmentation, and/or classification.
  • the perception component 1122 may include functionality to analyze pulse data to determine whether return pulses are likely to be multipath return pulses or single-reflection return pulses, as described herein.
  • the perception component 1122 can provide processed sensor data that indicates a presence of an entity that is proximate to the vehicle 1102 and/or a classification of the entity as an entity type (e.g., car, pedestrian, cyclist, animal, building, tree, road surface, curb, sidewalk, traffic signal, traffic light, car light, brake light, unknown).
  • the perception component 1122 can provide processed sensor data that indicates one or more characteristics associated with a detected entity (e.g., a tracked object) and/or the environment in which the entity is positioned.
  • the perception component 1122 may use the multichannel data structures as described herein, such as the multichannel data structures generated by the described deconvolution process, to generate processed sensor data.
  • characteristics associated with an entity or object can include, but are not limited to, an x-position (global and/or local position), a y-position (global and/or local position), a z-position (global and/or local position), an orientation (e.g., a roll, pitch, yaw), an entity type (e.g., a classification), a velocity of the entity, an acceleration of the entity, an extent of the entity (size), etc.
  • entity characteristics may be represented in a multichannel data structure as described herein (e.g., a multichannel data structure generated as output of one or more deconvolution layers (e.g., learned deconvolutional upsampling decoding layer(s)) using a learned upsampling transformation).
  • Characteristics associated with the environment can include, but are not limited to, a presence of another entity in the environment, a state of another entity in the environment, a time of day, a day of a week, a season, a weather condition, an indication of darkness/light, etc.
  • the perception component 1122 can provide processed return pulse data as described herein.
  • the planning component 1124 can determine a path for the vehicle 1102 to follow to traverse through an environment.
  • the planning component 1124 can determine various routes and trajectories at various levels of detail.
  • the planning component 1124 can determine a route (e.g., planned route) to travel from a first location (e.g., a current location) to a second location (e.g., a target location).
  • a route can be a sequence of waypoints for travelling between two locations.
  • waypoints include streets, intersections, global positioning system (GPS) coordinates, etc.
  • the planning component 1124 can generate an instruction for guiding the autonomous vehicle along at least a portion of the route from the first location to the second location. In at least one example, the planning component 1124 can determine how to guide the autonomous vehicle from a first waypoint in the sequence of waypoints to a second waypoint in the sequence of waypoints.
  • the instruction can be a trajectory, or a portion of a trajectory.
  • multiple trajectories can be substantially simultaneously generated (e.g., within technical tolerances) in accordance with a receding horizon technique, wherein one of the multiple trajectories is selected for the vehicle 1102 to navigate.
  • the vehicle computing device 1104 can include one or more system controllers 1126, which can be configured to control steering, propulsion, braking, safety, emitters, communication, and other systems of the vehicle 1102. These system controller(s) 1126 can communicate with and/or control corresponding systems of the drive system(s) 1114 and/or other components of the vehicle 1102.
  • the memory 1118 can further include one or more maps 1128 that can be used by the vehicle 1102 to navigate within the environment.
  • a map can be any number of data structures modeled in two dimensions, three dimensions, or N-dimensions that are capable of providing information about an environment, such as, but not limited to, topologies (such as intersections), streets, mountain ranges, roads, terrain, and the environment in general.
  • a map can include, but is not limited to, texture information (e.g., color information).
  • a map can include a three-dimensional mesh of the environment.
  • the map can be stored in a tiled format, such that individual tiles of the map represent a discrete portion of an environment, and can be loaded into working memory as needed, as discussed herein.
  • the one or more maps 1128 can include at least one map (e.g., images and/or a mesh).
  • the vehicle 1102 can be controlled based at least in part on the maps 1128. That is, the maps 1128 can be used in connection with the localization component 1120, the perception component 1122, and/or the planning component 1124 to determine a location of the vehicle 1102, identify objects in an environment, and/or generate routes and/or trajectories to navigate within an environment.
  • the one or more maps 1128 can be stored on a remote computing device(s) (such as the computing device(s) 1134) accessible via network(s) 1132.
  • multiple maps 1128 can be stored based on, for example, a characteristic (e.g., type of entity, time of day, day of week, season of the year). Storing multiple maps 1128 can have similar memory requirements but increase the speed at which data in a map can be accessed.
  • the prediction component 1130 can generate predicted trajectories of objects in an environment. For example, the prediction component 1130 can generate one or more predicted trajectories for vehicles, pedestrians, animals, and the like within a threshold distance from the vehicle 1102. In some instances, the prediction component 1130 can measure a trace of an object and generate a trajectory for the object based on observed and predicted behavior. In some examples, the prediction component 1130 can use data and/or data structures based on return pulses as described herein to generate one or more predicted trajectories for various mobile objects in an environment. In some examples, the prediction component 1130 may be a sub-component of perception component 1122.
  • aspects of some or all of the components discussed herein can include any models, algorithms, and/or machine learning algorithms.
  • the components in the memory 1118 can be implemented as a neural network.
  • the memory 1118 may include a deep tracking network that may be configured with a convolutional neural network (CNN) that may include one or more convolution/deconvolution layers.
  • An example neural network is an algorithm that passes input data through a series of connected layers to produce an output.
  • Individual layers in a neural network can also comprise another neural network or can comprise any number of layers, and such individual layers may be convolutional, deconvolutional, and/or another type of layer.
  • a neural network can utilize machine learning, which can refer to a broad class of such algorithms in which an output is generated based on learned parameters.
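As a bare-bones illustration of "a series of connected layers" with learned parameters, consider this dependency-free Python sketch; the weights shown are placeholders standing in for learned values, not parameters from this disclosure:

```python
def relu(values):
    return [max(0.0, v) for v in values]

def dense_layer(weights, biases, inputs):
    # One fully connected layer: output_j = sum_i(W[j][i] * x[i]) + b[j]
    return [sum(w * x for w, x in zip(row, inputs)) + b
            for row, b in zip(weights, biases)]

def forward(layers, inputs):
    # Pass input data through a series of connected layers to produce output.
    for weights, biases in layers:
        inputs = relu(dense_layer(weights, biases, inputs))
    return inputs

# Example: two tiny layers with placeholder (untrained) parameters.
layers = [([[0.5, -0.2], [0.1, 0.8]], [0.0, 0.1]),
          ([[1.0, -1.0]], [0.2])]
print(forward(layers, [0.3, 0.7]))
```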
  • machine learning algorithms can include, but are not limited to, regression algorithms (e.g., ordinary least squares regression (OLSR), linear regression, logistic regression, stepwise regression, multivariate adaptive regression splines (MARS), locally estimated scatterplot smoothing (LOESS)), instance-based algorithms (e.g., ridge regression, least absolute shrinkage and selection operator (LASSO), elastic net, least-angle regression (LARS)), decision tree algorithms (e.g., classification and regression tree (CART), iterative dichotomiser 3 (ID3), Chi-squared automatic interaction detection (CHAID), decision stump, conditional decision trees), Bayesian algorithms (e.g., naive Bayes, Gaussian naive Bayes, multinomial naive Bayes, average one-dependence estimators (AODE), Bayesian belief network (BBN)), deep learning algorithms (e.g., Deep Boltzmann Machine (DBM), Deep Belief Networks (DBN), Convolutional Neural Network (CNN), Stacked Auto-Encoders), dimensionality reduction algorithms (e.g., Principal Component Analysis (PCA), Principal Component Regression (PCR), Partial Least Squares Regression (PLSR), Sammon Mapping, Multidimensional Scaling (MDS), Linear Discriminant Analysis (LDA), Mixture Discriminant Analysis (MDA), Quadratic Discriminant Analysis (QDA), Flexible Discriminant Analysis (FDA)), Ensemble Algorithms (e.g., Boosting, Bootstrapped Aggregation (Bagging), AdaBoost, Stacked Generalization (blending), Gradient Boosting Machines (GBM), Gradient Boosted Regression Trees (GBRT), Random Forest), SVM (support vector machine), supervised learning, unsupervised learning, semi-supervised learning, etc.
  • Additional examples of architectures include neural networks such as ResNet50, ResNet101, VGG, DenseNet, PointNet, and the like.
  • the sensor system(s) 1106 can include radar sensors, ultrasonic transducers, sonar sensors, location sensors (e.g., GPS, compass), inertial sensors (e.g., inertial measurement units (IMUs), accelerometers, magnetometers, gyroscopes), cameras (e.g., RGB, IR, intensity, depth), time of flight sensors, microphones, wheel encoders, environment sensors (e.g., temperature sensors, humidity sensors, light sensors, pressure sensors), etc.
  • the sensor system(s) 1106 can include multiple instances of one or more of these or other types of sensors.
  • the camera sensors can include multiple cameras disposed at various locations about the exterior and/or interior of the vehicle 1102.
  • the sensor system(s) 1106 can provide input to the vehicle computing device 1104. Additionally, or alternatively, the sensor system(s) 1106 can send sensor data, via the one or more networks 1132, to the one or more computing device(s) at a particular frequency, after a lapse of a predetermined period of time, in near real-time, etc.
  • the sensor system(s) 1106 can include one or more lidar systems, such as one or more monostatic lidar systems, bistatic lidar systems, rotational lidar systems, solid state lidar systems, and/or flash lidar systems.
  • the sensor system(s) 1106 may also, or instead, include functionality to analyze the return signals of lidar pulses to determine and calibrate out background noise from the lidar reflectivity data, as described herein.
  • a lidar system of the sensor system(s) 1106 may perform one or more of the operations described herein to perform ADC sampling during different time windows associated with a lidar return pulse and within the dwell time between return pulses, and to determine background noise levels based on a comparison/analysis of the ADC samples.
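A simplified sketch of that comparison, assuming the ADC has produced one set of samples covering the return-pulse window and another taken during the dwell time between pulses (function name and units are assumptions):

```python
def estimate_background_and_correct(return_window_samples, dwell_samples):
    """Estimate background from dwell-time ADC samples and correct the return.

    The dwell-time window contains no return pulse, so its mean approximates
    the ambient (background) light level; subtracting it from the samples in
    the return window isolates the pulse contribution.
    """
    background = sum(dwell_samples) / len(dwell_samples)
    corrected = [sample - background for sample in return_window_samples]
    return background, corrected

# Example with made-up ADC counts: a pulse riding on an ambient level of ~40.
background, corrected = estimate_background_and_correct(
    [41, 44, 90, 152, 88, 43], [40, 39, 41, 40])
```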
  • the vehicle 1102 can also include one or more emitters 1108 for emitting light (visible and/or non-visible) and/or sound.
  • the emitter(s) 1108 in an example include interior audio and visual emitters to communicate with passengers of the vehicle 1102.
  • interior emitters can include speakers, lights, signs, display screens, touch screens, haptic emitters (e.g., vibration and/or force feedback), mechanical actuators (e.g., seatbelt tensioners, seat positioners, headrest positioners), and the like.
  • the emitter(s) 1108 in this example may also include exterior emitters.
  • the exterior emitters in this example include lights to signal a direction of travel or other indicator of vehicle action (e.g., indicator lights, signs, light arrays), and one or more audio emitters (e.g., speakers, speaker arrays, horns) to audibly communicate with pedestrians or other nearby vehicles, one or more of which may comprise acoustic beam steering technology.
  • the exterior emitters in this example may also, or instead, include non-visible light emitters such as infrared emitters, near-infrared emitters, and/or lidar emitters.
  • the vehicle 1102 can also include one or more communication connection(s) 1110 that enable communication between the vehicle 1102 and one or more other local or remote computing device(s).
  • the communication connection(s) 1110 can facilitate communication with other local computing device(s) on the vehicle 1102 and/or the drive system(s) 1114.
  • the communication connection(s) 1110 can allow the vehicle to communicate with other nearby computing device(s) (e.g., other nearby vehicles, traffic signals).
  • the communications connection(s) 1110 also enable the vehicle 1102 to communicate with a remote teleoperations computing device or other remote services.
  • the communications connection(s) 1110 can include physical and/or logical interfaces for connecting the vehicle computing device 1104 to another computing device or a network, such as network(s) 1132.
  • the communications connection(s) 1110 can enable Wi-Fi-based communication such as via frequencies defined by the IEEE 802.11 standards, short range wireless frequencies such as Bluetooth, cellular communication (e.g., 2G, 3G, 4G, 4G LTE, 5G) or any suitable wired or wireless communications protocol that enables the respective computing device to interface with the other computing device(s).
  • the vehicle 1102 can include one or more drive systems 1114.
  • the vehicle 1102 can have a single drive system 1114.
  • individual drive systems 1114 can be positioned on opposite ends of the vehicle 1102 (e.g., the front and the rear).
  • the drive system(s) 1114 can include one or more sensor systems to detect conditions of the drive system(s) 1114 and/or the surroundings of the vehicle 1102.
  • the sensor system(s) 1106 can include one or more wheel encoders (e.g., rotary encoders) to sense rotation of the wheels of the drive systems, inertial sensors (e.g., inertial measurement units, accelerometers, gyroscopes, magnetometers) to measure orientation and acceleration of the drive system, cameras or other image sensors, ultrasonic sensors to acoustically detect objects in the surroundings of the drive system, lidar sensors, radar sensors, etc.
  • Some sensors, such as the wheel encoders, can be unique to the drive system(s) 1114.
  • the sensor system(s) on the drive system(s) 1114 can overlap or supplement corresponding systems of the vehicle 1102 (e.g., sensor system(s) 1106).
  • the drive system(s) 1114 can include many of the vehicle systems, including a high voltage battery, a motor to propel the vehicle, an inverter to convert direct current from the battery into alternating current for use by other vehicle systems, a steering system including a steering motor and steering rack (which can be electric), a braking system including hydraulic or electric actuators, a suspension system including hydraulic and/or pneumatic components, a stability control system for distributing brake forces to mitigate loss of traction and maintain control, an HVAC system, lighting (e.g., lighting such as head/tail lights to illuminate an exterior surrounding of the vehicle), and one or more other systems (e.g., cooling system, safety systems, onboard charging system, other electrical components such as a DC/DC converter, a high voltage junction, a high voltage cable, charging system, charge port).
  • the drive system(s) 1114 can include a drive system controller which can receive and preprocess data from the sensor system(s) and control operation of the various vehicle systems.
  • the drive system controller can include one or more processors and memory communicatively coupled with the one or more processors.
  • the memory can store one or more components to perform various functionalities of the drive system(s) 1114.
  • the drive system(s) 1114 may also include one or more communication connection(s) that enable communication by the respective drive system with one or more other local or remote computing device(s).
  • the direct connection 1112 can provide a physical interface to couple the one or more drive system(s) 1114 with the body of the vehicle 1102.
  • the direct connection 1112 can allow the transfer of energy, fluids, air, data, etc. between the drive system(s) 1114 and the vehicle.
  • the direct connection 1112 can further releasably secure the drive system(s) 1114 to the body of the vehicle 1102.
  • the vehicle 1102 can send sensor data to one or more computing device(s) 1134 via the network(s) 1132.
  • the vehicle 1102 can send raw sensor data to the computing device(s) 1134.
  • the vehicle 1102 can send processed sensor data and/or representations of sensor data (e.g., data representing return pulses) to the computing device(s) 1134.
  • the vehicle 1102 can send sensor data to the computing device(s) 1134 at a particular frequency, after a lapse of a predetermined period of time, in near real-time, etc.
  • the vehicle 1102 can send sensor data (raw or processed) to the computing device(s) 1134 as one or more log files.
  • the computing device(s) 1134 can include processor(s) 1136 and a memory 1138 storing a planning component 1142 and/or a perception component 1140.
  • the perception component 1140 can substantially correspond to the perception component 1122 and can include substantially similar functionality.
  • the planning component 1142 can substantially correspond to the planning component 1124 and can include substantially similar functionality.
  • the processor(s) 1116 of the vehicle 1102 and the processor(s) 1136 of the computing device(s) 1134 can be any suitable processor capable of executing instructions to process data and perform operations as described herein.
  • the processor(s) 1116 and 1136 can comprise one or more Central Processing Units (CPUs), Graphics Processing Units (GPUs), and/or any other device or portion of a device that processes electronic data to transform that electronic data into other electronic data that can be stored in registers and/or memory.
  • integrated circuits (e.g., ASICs), gate arrays (e.g., FPGAs), and other hardware devices can also be considered processors in so far as they are configured to implement encoded instructions.
  • Memory 1118 and 1138 are examples of non-transitory computer-readable media.
  • the memory 1118 and 1138 can store an operating system and one or more software applications, instructions, programs, and/or data to implement the methods described herein and the functions attributed to the various systems.
  • the memory can be implemented using any suitable memory technology, such as static random-access memory (SRAM), synchronous dynamic RAM (SDRAM), nonvolatile/Flash-type memory, or any other type of memory capable of storing information.
  • the architectures, systems, and individual elements described herein can include many other logical, programmatic, and physical components, of which those shown in the accompanying figures are merely examples that are related to the discussion herein.
  • While FIG. 11 is illustrated as a distributed system, in alternative examples, components of the vehicle 1102 can be associated with the computing device(s) 1134 and/or components of the computing device(s) 1134 can be associated with the vehicle 1102. That is, the vehicle 1102 can perform one or more of the functions associated with the computing device(s) 1134, and vice versa.
  • any one or more parameters may be adjusted based at least in part on the observed background including, but not limited to, transmit power, pulse duration, receiver integration time, receiver photosensitivity, lidar receiver aperture size, or any other parameter which may impact SNR, link budget, or performance.
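For example, a control loop might raise transmit power or integration time when the observed background erodes SNR below a target. A rough sketch follows, in which the configuration fields, step sizes, and SNR proxy are purely illustrative assumptions:

```python
def adjust_lidar_parameters(config, observed_background, target_snr=10.0):
    """Nudge lidar parameters when observed background noise erodes SNR.

    config is a plain dict with illustrative fields; the 1.5x/1.25x step
    sizes and the SNR proxy are assumptions for this sketch, not values
    from this disclosure.
    """
    snr_proxy = config["transmit_power"] / max(observed_background, 1e-9)
    if snr_proxy < target_snr:
        # Increase transmit power toward its limit, and integrate longer so
        # the return pulse accumulates more signal relative to background.
        config["transmit_power"] = min(config["transmit_power"] * 1.5,
                                       config["max_transmit_power"])
        config["receiver_integration_time"] *= 1.25
    return config
```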
  • a lidar system comprising: at least one laser; at least one photodetector; one or more processors; and one or more non-transitory computer-readable media storing computer-executable instructions that, when executed, cause the one or more processors to perform operations comprising: transmitting, via the laser, a lidar pulse into an environment, at a pulse transmission time; determining, based at least in part on the pulse transmission time, a first sampling time window associated with a return signal of the lidar pulse; receiving first light data from the environment during the first sampling time window; determining a second sampling time window within a dwell time between the first sampling time window and a second lidar pulse; receiving second light data from the environment during the second sampling time window; determining, based at least in part on the second light data, a lidar background noise level associated with the environment; and determining reflectivity data associated with the lidar pulse, based at least in part on the first light data and the lidar background noise level.
  • receiving the first light data during the first sampling time window comprises: sampling, using an analog-to-digital converter (ADC) of the lidar system, light received by the at least one photodetector during the first sampling time window; and accumulating, using an integrator of the lidar system, the sampled light received during the first sampling time window.
  • determining the reflectivity data associated with the lidar pulse comprises: determining a peak power level associated with the lidar pulse; and subtracting the lidar background noise level from the peak power level.
  • determining the peak power level associated with the lidar pulse comprises: determining, using a time-to-digital converter (TDC) of the lidar system, a peak time associated with the lidar pulse; and determining, using an analog-to-digital converter (ADC) of the lidar system, and based at least in part on the peak time, a magnitude associated with the peak power level.
  • determining the second sampling time window is based at least in part on at least one of: a transmission power associated with the lidar pulse; a range associated with the lidar system; or a transmission frequency associated with the lidar system.
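Putting the pieces of this example system together, the sketch below derives the two sampling windows from the pulse transmission time, reads the peak via a TDC-style timestamp, and subtracts the background. Constants such as the 200 m range and the dwell margin are assumptions for illustration only:

```python
SPEED_OF_LIGHT = 3.0e8  # m/s

def sampling_windows(pulse_transmission_time, max_range=200.0, dwell_margin=1e-7):
    """Return (first_window, second_window) as (start, end) times in seconds.

    The first window covers round-trip times for targets out to max_range;
    the second sits in the dwell time after it, before the next pulse, where
    only background light should reach the photodetector.
    """
    round_trip = 2.0 * max_range / SPEED_OF_LIGHT
    first = (pulse_transmission_time, pulse_transmission_time + round_trip)
    second = (first[1] + dwell_margin, first[1] + dwell_margin + round_trip)
    return first, second

def peak_power(tdc_peak_time, adc_samples, sample_times):
    # Use the TDC-reported peak time to pick the ADC sample giving the
    # magnitude of the peak power level.
    idx = min(range(len(sample_times)),
              key=lambda i: abs(sample_times[i] - tdc_peak_time))
    return adc_samples[idx]

def reflectivity(peak, background_noise_level):
    # Reflectivity data per the example operations: peak minus background.
    return max(0.0, peak - background_noise_level)
```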
  • a method comprising: determining a pulse transmission time associated with a lidar pulse transmitted by a lidar system into an environment; determining, based at least in part on the pulse transmission time, a sampling time window; receiving light data from the environment during the sampling time window; determining, based at least in part on the light data, a lidar background noise level associated with the environment; and determining reflectivity data associated with the lidar pulse, based at least in part on the lidar background noise level.
  • receiving the light data during the sampling time window comprises: sampling, using an analog-to-digital converter (ADC) of the lidar system, light received by a photodetector during the sampling time window; and accumulating, using an integrator of the lidar system, the sampled light received during the sampling time window.
  • determining the reflectivity data associated with the lidar pulse comprises: determining a peak power level associated with the lidar pulse; and subtracting the lidar background noise level from the peak power level.
  • determining the peak power level associated with the lidar pulse comprises: determining, using a time-to-digital converter (TDC) of the lidar system, a peak time associated with the lidar pulse; and determining, using an analog-to-digital converter (ADC) of the lidar system, and based at least in part on the peak time, a magnitude associated with the peak power level.
  • determining the sampling time window is based at least in part on at least one of: a transmission power associated with the lidar pulse; a range associated with the lidar system; or a transmission frequency associated with the lidar system.
  • N One or more non-transitory computer-readable media storing instructions executable by one or more processors, wherein the instructions, when executed, cause the one or more processors to perform operations comprising: determining a pulse transmission time associated with a lidar pulse transmitted by a lidar system into an environment; determining, based at least in part on the pulse transmission time, a sampling time window; receiving light data from the environment during the sampling time window; determining, based at least in part on the light data, a lidar background noise level associated with the environment; and determining reflectivity data associated with the lidar pulse, based at least in part on the lidar background noise level.
  • receiving the light data during the sampling time window comprises: sampling, using an analog-to-digital converter (ADC) of the lidar system, light received by a photodetector during the sampling time window; and accumulating, using an integrator of the lidar system, the sampled light received during the sampling time window.
  • determining the reflectivity data associated with the lidar pulse comprises: determining a peak power level associated with the lidar pulse; and subtracting the lidar background noise level from the peak power level.
  • determining the peak power level associated with the lidar pulse comprises: determining, using a time-to-digital converter (TDC) of the lidar system, a peak time associated with the lidar pulse; and determining, using an analog-to-digital converter (ADC) of the lidar system, and based at least in part on the peak time, a magnitude associated with the peak power level.
  • determining the sampling time window is based at least in part on at least one of: a transmission power associated with the lidar pulse; a range associated with the lidar system; or a transmission frequency associated with the lidar system.
  • T The one or more non-transitory computer-readable media of paragraph N, the operations further comprising: determining a second sampling time window associated with a second lidar pulse, wherein the lidar pulse has a first transmission power and the second lidar pulse has a second transmission power different from the first transmission power; and receiving second light data from the environment during the second sampling time window, wherein determining the lidar background noise level associated with the environment is based at least in part on the first transmission power, the light data, the second transmission power, and the second light data.
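One way to read that last example: if the measured light scales linearly with transmit power while the background does not, two shots at different powers let the background be solved for directly. A sketch under that assumed linear model:

```python
def background_from_two_powers(power_1, light_1, power_2, light_2):
    """Solve for power-independent background B under an assumed linear model.

    Model (an assumption for this sketch): light_i = a * power_i + B, where
    a * power_i is the reflected-signal term and B is the background.
    """
    if power_1 == power_2:
        raise ValueError("the two transmission powers must differ")
    a = (light_1 - light_2) / (power_1 - power_2)
    return light_1 - a * power_1
```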
  • the components described herein represent instructions that may be stored in any type of computer-readable medium and may be implemented in software and/or hardware. All of the methods and processes described above may be embodied in, and fully automated via, software code modules and/or computer-executable instructions executed by one or more computers or processors, hardware, or some combination thereof. Some or all of the methods may alternatively be embodied in specialized computer hardware.
  • Conditional language such as, among others, "may," "could," or "might," unless specifically stated otherwise, is understood within the context to present that certain examples include, while other examples do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that certain features, elements and/or steps are in any way required for one or more examples or that one or more examples necessarily include logic for deciding, with or without user input or prompting, whether certain features, elements and/or steps are included or are to be performed in any particular example.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The present disclosure relates to detecting and calibrating out noise from lidar data, including noise caused by solar radiation and/or other sources of lidar background noise. A lidar system may determine a background noise level for a location by sampling, using an analog-to-digital converter (ADC), during a time window associated with a lidar pulse. The ADC sample may be compared with additional ADC samples taken during additional time periods between lidar pulses. The ADC samples may be analyzed to determine the lidar background noise level of the environment, and the lidar data may be modified to calibrate out the background noise. In some examples, the lidar system itself may be reconfigured based on the lidar background noise, including modifying the laser emission power, aperture size, optical gain, and/or other characteristics of the lidar system to calibrate out the background noise and/or improve the signal-to-noise ratio (SNR) of the lidar data.
PCT/US2023/034809 2022-10-11 2023-10-10 Lidar background noise detection and compensation WO2024081230A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263415177P 2022-10-11 2022-10-11
US63/415,177 2022-10-11

Publications (1)

Publication Number Publication Date
WO2024081230A1 true WO2024081230A1 (fr) 2024-04-18

Family

ID=90669989

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/034809 WO2024081230A1 (fr) 2022-10-11 2023-10-10 Lidar background noise detection and compensation

Country Status (1)

Country Link
WO (1) WO2024081230A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180188104A1 (en) * 2015-06-26 2018-07-05 Nec Corporation Signal detection device, signal detection method, and recording medium
US20200158835A1 (en) * 2018-11-16 2020-05-21 Genoptics Precision Biotechnologies Inc. Time-of-flight ranging sensor and time-of-flight ranging method
US20200174120A1 (en) * 2018-11-30 2020-06-04 Nxp B.V. Lidar system and method of operating the lidar system
WO2021026241A1 (fr) * 2019-08-05 2021-02-11 Ouster, Inc. Système de traitement pour mesures lidar
CN111983586A (zh) * 2020-08-12 2020-11-24 深圳市镭神智能系统有限公司 一种光电探测器的控制方法、控制系统及激光雷达

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23877914

Country of ref document: EP

Kind code of ref document: A1