WO2021002912A1 - Interference mitigation for light detection and ranging - Google Patents

Interference mitigation for light detection and ranging Download PDF

Info

Publication number
WO2021002912A1
Authority
WO
WIPO (PCT)
Prior art keywords
point
target point
neighboring
pulses
pulse signal
Prior art date
Application number
PCT/US2020/026925
Other languages
French (fr)
Inventor
Matthew Rekow
Stephen Nestinger
Aaron Chen
Original Assignee
Velodyne Lidar Usa, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Velodyne Lidar Usa, Inc. filed Critical Velodyne Lidar Usa, Inc.
Priority to JP2022500148A priority Critical patent/JP7546647B2/en
Priority to MX2021016061A priority patent/MX2021016061A/en
Priority to EP20834220.4A priority patent/EP3973316A4/en
Priority to CA3144656A priority patent/CA3144656A1/en
Priority to KR1020227003050A priority patent/KR20220025872A/en
Priority to CN202080060860.8A priority patent/CN114270215A/en
Publication of WO2021002912A1 publication Critical patent/WO2021002912A1/en
Priority to IL289131A priority patent/IL289131B2/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483Details of pulse systems
    • G01S7/486Receivers
    • G01S7/487Extracting wanted echo signals, e.g. pulse detection
    • G01S7/4876Extracting wanted echo signals, e.g. pulse detection by removing unwanted signals
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • G01S17/10Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87Combinations of systems using electromagnetic waves other than radio waves

Definitions

  • LIDAR Light Detection and Ranging
  • LIDAR systems have a wide range of applications including autonomous driving and aerial mapping of a surface. These applications may place a high priority on the security, accuracy and reliability of the operation.
  • the disclosed embodiments provide methods and systems to reduce or minimize signal interference when multiple LIDAR sensors are present, thereby generating reliable data for determining surface features of the environment.
  • FIG. 1 illustrates an example schematic diagram of a LIDAR sensor in accordance with the present technology.
  • FIG. 2 illustrates an example schematic diagram of a LIDAR system in accordance with the present technology.
  • FIG. 3 illustrates another example schematic diagram of a LIDAR system in accordance with the present technology.
  • FIG. 4 depicts an example pulse signal generated by a controller.
  • FIG. 5A illustrates an example interference pattern that appears in a point cloud.
  • FIG. 5B illustrates an example interference pattern that appears in a point cloud.
  • FIG. 6 illustrates an example dithered pulse signal in accordance with the present technology.
  • FIG. 7 illustrates an example interference pattern after dithering the pulse signal in accordance with the present technology.
  • FIG. 8 illustrates an example of a filtering process in accordance with the present technology.
  • FIG. 9 is a flowchart representation of a method for sensing an object in an external environment using a light detection and ranging device in accordance with the present technology.
  • FIG. 10 is a block diagram illustrating an example of the architecture for a computer system or other control device that can be utilized to implement various portions of the presently disclosed technology.
  • LIDAR systems operate by transmitting a series of light pulses that reflect off objects.
  • the reflected signal, or return signal is received by the light detection and ranging system.
  • the system determines the range/distance between the system and the object.
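The time-of-flight principle behind this range determination can be illustrated with a short sketch (the function name is hypothetical; this is not text from the patent):

```python
# Illustrative sketch of time-of-flight ranging: the emitted pulse travels
# to the object and back, so range is half the round-trip path length.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def time_of_flight_to_range(round_trip_seconds: float) -> float:
    """Convert a pulse's measured round-trip time into a range in meters."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A return arriving roughly 667 ns after firing corresponds to about 100 m.
print(time_of_flight_to_range(667e-9))
```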
  • FIG. 1 illustrates an example schematic diagram of a LIDAR sensor 100 in accordance with the present technology.
  • the LIDAR sensor 100 includes a light emitter 101 that emits a light signal to an object 111 in an external environment and a receiver 103 that receives the return signal reflected by an object 111.
  • the LIDAR sensor 100 adopts a beam steering mechanism to steer the light signal to the object 111.
  • the LIDAR sensor 100 can be configured to rotate around an axis to achieve a larger field of view (e.g., a 360-degree horizontal field of view).
  • the receiver 103 may include an analog-to-digital (A2D) convertor (not shown) that converts the return signal to a digital signal.
  • the LIDAR sensor 100 may optionally include a controller 105.
  • the controller is a part of the LIDAR systems.
  • the controller 105 is in communication with both the light emitter 101 and the receiver 103 so that the controller 105 can control light emissions by the light emitter 101 and process the electrical signals from the receiver 103.
  • the controller 105 uses a set of points, such as a point cloud, based on the captured distances to represent at least a partial surface of the object 111 in the environment.
  • the points in the point cloud can represent or indicate surface features of the object 111.
  • the light emitter 101 can emit multiple light beams.
  • FIG. 2 illustrates an example schematic diagram of a LIDAR system 200 in accordance with the present technology.
  • the LIDAR system 200 includes a LIDAR sensor 201 that emits a plurality of beams over an angular range.
  • the light emitter can emit 32 beams of light (also referred to as 32 channels).
  • the LIDAR system includes multiple LIDAR sensors to obtain a dense set of data for an accurate representation of the object.
  • FIG. 3 illustrates an example schematic diagram of a LIDAR system 300 in accordance with the present technology.
  • the LIDAR system 300 depicted in this example includes a first LIDAR sensor 301 and a second LIDAR sensor 303. Both of the sensors are capable of emitting multiple light beams. Because the sensors are positioned in proximity to each other within a single LIDAR system, crosstalk interference among the sensors can occur. Furthermore, with an increasing number of autonomous vehicles equipped with LIDAR systems to detect and avoid obstacles in the environment, direct interference among multiple LIDAR sensors may occur.
  • the reception of foreign light signals can lead to problems such as ghost targets or a reduced signal-to-noise ratio.
  • Disclosed herein are techniques that can be implemented in various embodiments to reduce or minimize the impact of interference among multiple LIDAR sensors, thereby providing a more reliable model to represent the environment for safe maneuvering.
  • the light emitter generates pulsed light emissions in response to a pulse electrical signal provided by the controller.
  • FIG. 4 depicts an example pulse electrical signal 400 generated by a controller.
  • the pulse signal 400 includes multiple uniformly or non-uniformly spaced pulses.
  • the pulses can be uniformly distributed in a period of time T_p. That is, adjacent pulses are separated by the same distance of t in the time domain.
  • direct or crosstalk interference among the signals is likely to occur.
  • inter-sensor interference can happen, which includes direct interference when sensors are pointing at each other and indirect interference when signals from multiple sensors bounce off the object(s).
  • FIGS. 5A-B illustrate example interference patterns that can appear in a point cloud.
  • the interference can appear as a curved surface in the point cloud data set. The curved surface can be interpreted as a real obstacle in the environment, thereby severely impacting computations of the controller and subsequent navigation of the vehicle that carries the LIDAR system.
  • One way to minimize the impact of such interference is to distribute the interference signals across the field of view of the LIDAR sensor. If the interference shows up as small ghost points located at various distances from the LIDAR sensor, there is a lower probability for the controller to interpret these signals as a solid obstacle.
  • the controller can control the light emitter to "dither" the pulse signal so that the firing time of the pulses is randomized in the period T_p.
  • FIG. 6 illustrates an example dithered pulse signal in accordance with the present technology. A randomized Δt_random can be added to or subtracted from t for each firing of the pulse. Thus, the adjacent pulses are separated by different distances (t ± Δt_random) in the time domain.
  • Δt_random can be generated by a pseudo-random generation algorithm.
  • Δt_random is up to around 1 µs.
  • Due to the nature of pseudo-random generation, occasionally two adjacent pulses may have the same spacing as compared to two pulses in another sequence. For example, t₁ in FIG. 6 can be equal to t in FIG. 4. However, the time-domain spacings of the pulses are sufficiently different due to the randomization so that the interference, when it does occur, is distributed in distance to form a more uniform noise pattern.
  • the firing sequence can be dynamically re-ordered to allow a larger range of Atrandom.
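The dithered firing scheme described above can be sketched as follows. This is a minimal, hypothetical illustration, not the patented implementation; the function name, parameters, and use of a seeded pseudo-random generator are assumptions:

```python
import random

def dithered_firing_times(num_pulses: int, base_spacing: float,
                          max_dither: float, seed: int = 0) -> list:
    """Return firing times where each inter-pulse spacing is base_spacing
    plus or minus a random offset bounded by max_dither (t ± Δt_random)."""
    rng = random.Random(seed)  # pseudo-random generation, as described
    times, t = [], 0.0
    for _ in range(num_pulses):
        times.append(t)
        t += base_spacing + rng.uniform(-max_dither, max_dither)
    return times

# Example: 1 µs nominal spacing with up to 100 ns of dither per firing.
times = dithered_firing_times(8, base_spacing=1e-6, max_dither=1e-7)
```

Because each spacing is randomized independently, two sensors running the same nominal pattern rarely stay aligned in time, which spreads any interference out in range rather than letting it pile up into a ghost surface.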
  • the dithering techniques as described herein can be combined with a pulse signal signature, described in U.S. Pub. No. 2019/0178991 (the entire content of which is incorporated by reference as part of the disclosure of this patent document), to further reduce or minimize the impact of interference.
  • each pulse signal can be given a pulse signal signature.
  • each pulse signal has a unique signature to ensure the security and reliability of the generated data.
  • the number of unique signatures available to the LIDAR sensor is also referred to as the code base.
  • two LIDAR sensors may have the same signature at the same time, leading to an interference pattern such as shown in FIGS. 5A-B.
  • Combining the dithering techniques with a code base of unique pulse signatures can greatly reduce the likelihood of signal interference by two LIDAR sensors sharing the same pulse signal.
  • FIG. 7 illustrates an example interference pattern after dithering the pulse signal in accordance with the present technology. As shown in FIG. 7, the interference is more uniformly distributed to resemble a white noise pattern. The controller now can apply a filtering process to remove the points that are deemed to be interference.
  • FIG. 8 illustrates an example of a filtering process in accordance with the present technology. In FIG. 8, a set of points has been collected to represent an object or at least a partial surface of the object in the environment. The collected points also include a significant amount of interference showing as noise. In order to determine whether a particular point is interference/noise or actual data representing the object, the controller can check information carried in the neighboring points to determine whether there is a coherency among the points.
  • the points in the point cloud dataset are represented using a Cartesian coordinate system.
  • Information carried by the neighboring points in x, y, and/or z directions can be used to determine whether the point and its neighbors represent the same surface.
  • point 801 can be checked against its neighboring points in x, y, and/or z directions.
  • the neighboring points along x direction of point 801 do not provide sufficient information because they also appear to be noise.
  • the neighboring points along y direction of point 801 indicate that there is no coherence between point 801 and some of the neighbors: they appear to represent two separate surfaces. Point 801 can thus be filtered out as a noise point.
  • point 802 can be checked against its neighboring points.
  • the neighboring points along both x and y directions indicate that there exists a coherence between point 802 and its neighbors.
  • point 802 remains in the model.
  • the points in the point cloud carry a normal vector N indicating an estimated surface normal of the object. Based on the normal vector N, two substantially orthogonal directions x' and y' can be determined. Information carried by the neighboring points in x' and y' can be used to determine whether the point and its neighbors represent the same surface. For example, in some embodiments, each point carries color information. If the point has the same color as its neighboring points (or if the color difference is equal to or less than a predefined threshold), a point coherence exists. As another example, if a distance between the point and its neighboring point exceeds a predefined threshold, the controller can determine that there is no coherence between the point and its neighbors. When there is no coherence between a point and its neighbors, the point is removed or filtered from the model as noise.
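Deriving the two substantially orthogonal directions x' and y' from a surface normal N could look like the following sketch (helper names are hypothetical; any standard tangent-frame construction would serve):

```python
import math

def _cross(a, b):
    """Cross product of two 3-vectors represented as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def _normalize(v):
    """Scale a vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def orthogonal_directions(normal):
    """Given an estimated surface normal N, derive two orthonormal
    directions x' and y' spanning the surface's tangent plane."""
    n = _normalize(normal)
    # Choose a helper axis that is not (nearly) parallel to the normal.
    helper = (1.0, 0.0, 0.0) if abs(n[0]) < 0.9 else (0.0, 1.0, 0.0)
    x_prime = _normalize(_cross(n, helper))
    y_prime = _cross(n, x_prime)  # unit length since n and x' are orthonormal
    return x_prime, y_prime
```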
  • Examining neighboring points in two or more directions can provide a more reliable determination of the point coherence when the object has a sharp surface feature or a small dimension along one direction. For example, if the object is a thin, long pole, examining points along the x direction in the Cartesian coordinate system may not provide a sufficient amount of information because there are only a few surface points available. Examining points along the y direction, however, allows the system to use more surface points of the pole to determine whether a coherence exists between the target point (potentially a noise point) and the object.
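A minimal sketch of this two-direction coherence check follows. The names are hypothetical and only a distance criterion is shown; as the description notes, color, reflectivity, and signal quality could factor into the same decision:

```python
import math

def has_coherence(target, neighbors, threshold):
    """One simple coherence criterion: the target is coherent with a
    direction's neighbors if at least one of them lies within the
    distance threshold of the target point."""
    return any(math.dist(target, p) <= threshold for p in neighbors)

def keep_target(target, neighbors_x, neighbors_y, threshold):
    """Keep the point if it shows coherence along either examined
    direction; otherwise it is filtered out as noise."""
    return (has_coherence(target, neighbors_x, threshold)
            or has_coherence(target, neighbors_y, threshold))
```

Combining both directions is what makes the thin-pole case workable: even when one direction offers too few informative neighbors, the other can still establish (or rule out) coherence.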
  • a bounding box is placed around the target point.
  • the spatial neighbors of the target point generated by prior and/or next signals from the same channel or concurrent/prior/next signals from a vertically adjacent channel are examined. If the distance between a neighboring point and the target point is equal to or less than a threshold, the target point can be considered as a coherent point to the neighboring point. Other factors, such as color of the object, reflectivity of the object, strength of the signal, and signal to noise ratio (SNR) of the signal are also considered to determine the coherence.
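The bounding-box variant might be sketched as an illustrative count-based filter (function and parameter names are assumptions, and the thresholds are placeholders):

```python
import math

def keep_by_neighbor_count(target, points, box_half_size,
                           distance_threshold, min_coherent):
    """Count points inside a bounding box around the target that also lie
    within distance_threshold of it; drop the target when the number of
    coherent neighbors is equal to or smaller than min_coherent."""
    def in_box(p):
        return all(abs(p[i] - target[i]) <= box_half_size for i in range(3))

    coherent = sum(1 for p in points
                   if p != target and in_box(p)
                   and math.dist(p, target) <= distance_threshold)
    return coherent > min_coherent  # True means the point is kept
```

The coherent-neighbor count also lends itself to the confidence measure mentioned below: the more coherent points found in the box, the more confident the filter can be that the target belongs to a real surface.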
  • the LIDAR sensor may track a small set of points in the point cloud dataset instead of the entire dataset for the object(s) in the environment.
  • the LIDAR sensor or a filtering subsystem of the LIDAR sensor performs real-time filtering.
  • the real-time filtering can be performed to one object within a selected few frames, or multiple objects within one frame.
  • the controller or the filtering subsystem can also make certain assumptions about the object (e.g., smoothness, color, location, size) to facilitate real-time filtering of the points.
  • a confidence of validity of the filtering process can be derived based on the number of coherent points in the bounding box.
  • the LIDAR sensors can emit multiple sets of signals concurrently.
  • a combination of the dithering pattern (e.g., the burstiness of multiple channels) and a low confidence of validity in the filtering process can be used together to indicate whether the target point is a result of direct or crosstalk interference.
  • FIG. 9 is a flowchart representation of a method 900 for sensing an object in an external environment using a light detection and ranging device in accordance with the present technology.
  • the method 900 includes, at operation 902, emitting, by a light emitter of the light detection and ranging device, a pulse light signal towards the object according to an electrical pulse signal.
  • the electrical pulse signal comprises a first set of non-uniformly spaced pulses.
  • the method 900 includes, at operation 904, receiving, by a receiver of the light detection and ranging device, one or more returned light signals reflected by the object.
  • the method 900 includes, at operation 906, converting, by the receiver, the one or more return light signals into electrical signals.
  • the method 900 includes, at operation 908, generating, based on the electrical signals, a model that comprises a plurality of points representing a surface of the object.
  • the method 900 includes, at operation 910, filtering a target point from a set of points representing at least a partial surface of the object as noise by determining whether there is a coherence between the target point and corresponding neighboring points of the target point along a first direction and a second direction of the model.
  • FIG. 10 is a block diagram illustrating an example of the architecture for a computer system or other control device 1000 that can be utilized to implement various portions of the presently disclosed technology, such as the controller 105 shown in FIG. 1.
  • the computer system 1000 includes one or more processors 1005 and memory 1010 connected via an interconnect 1025.
  • the interconnect 1025 may represent any one or more separate physical buses, point to point connections, or both, connected by appropriate bridges, adapters, or controllers.
  • the interconnect 1025 may include, for example, a system bus, a Peripheral Component Interconnect (PCI) bus, a HyperTransport or industry standard architecture (ISA) bus, a small computer system interface (SCSI) bus, a universal serial bus (USB), IIC (I2C) bus, or an Institute of Electrical and Electronics Engineers (IEEE) standard 1394 bus, sometimes referred to as "Firewire."
  • the processor(s) 1005 may include central processing units (CPUs) to control the overall operation of, for example, the host computer. In certain embodiments, the processor(s) 1005 accomplish this by executing software or firmware stored in memory 1010.
  • the processor(s) 1005 may be, or may include, one or more programmable general-purpose or special-purpose microprocessors, digital signal processors (DSPs), programmable controllers, application specific integrated circuits (ASICs), programmable logic devices (PLDs), or the like, or a combination of such devices.
  • the memory 1010 can be, or include, the main memory of the computer system.
  • the memory 1010 represents any suitable form of random-access memory (RAM), read-only memory (ROM), flash memory, or the like, or a combination of such devices.
  • the memory 1010 may contain, among other things, a set of machine instructions which, when executed by processor 1005, causes processor 1005 to perform operations to implement embodiments of the presently disclosed technology.
  • the network adapter 1015 provides the computer system 1000 with the ability to communicate with remote devices, such as the storage clients, and/or other storage servers, and may be, for example, an Ethernet adapter or Fiber Channel adapter.
  • a light detection and ranging apparatus includes a light emitter configured to generate, according to a first electrical pulse signal, a pulse light signal that is directed toward an object in an external environment.
  • the first electrical pulse signal comprises a first set of non-uniformly spaced pulses.
  • the apparatus includes a receiver configured to receive one or more returned light signals reflected by the object and convert the returned light signals into electrical signals and a filtering subsystem in communication with the receiver.
  • the filtering subsystem is configured to receive the electrical signals from the receiver and remove a point from a set of points representing at least a partial surface of the object as noise by determining whether there is a coherence between the point and corresponding neighboring points of the point along at least a first direction and a second direction of the set of points.
  • two adjacent pulses of the first set of pulses are separated by a randomized distance in the time domain.
  • the light emitter is configured to generate a second pulse light signal according to a second electrical pulse signal, wherein the second electrical pulse signal comprises a second set of non-uniformly spaced pulses, and wherein at least one pulse in the second set of pulses is positioned at a different time-domain location than a corresponding pulse in the first set of pulses.
  • the light emitter is configured to generate a third pulse light signal according to a third electrical pulse signal, wherein the third electrical pulse signal comprises a third set of non-uniformly spaced pulses, and wherein no pulse in the third set of pulses shares a same time-domain location as any pulse in the first set of pulses.
  • each of the first, second, and third electrical pulse signal comprises a pulse signature. The pulse signature can be unique for each of the first, second, and third electrical pulse signal.
  • each point carries information about a vector that indicates an estimated surface normal of the object, and wherein the first direction and the second direction are determined based on the vector.
  • the first direction and the second direction are two directions in a Cartesian coordinate system.
  • determining the coherence between the point and the corresponding neighboring points comprises examining information carried in the neighboring points of the point and determining whether the point and the neighboring points both represent the surface of the same object.
  • the information carried in a neighboring point includes at least a location of the neighboring point or a color of the neighboring point.
  • a method for sensing an object in an external environment using a light detection and ranging device includes emitting, by a light emitter of the light detection and ranging device, a pulse light signal towards the object according to an electrical pulse signal, the electrical pulse signal comprising a first set of non-uniformly spaced pulses.
  • the method includes receiving, by a receiver of the light detection and ranging device, one or more returned light signals reflected by the object and converting, by the receiver, the one or more return light signals into electrical signals.
  • the method also includes filtering a target point from a set of points representing at least a partial surface of the object as noise by determining whether there is a coherence between the target point and corresponding neighboring points of the target point along a first direction and a second direction of the model.
  • the method includes emitting a second pulse light signal according to a second electrical pulse signal
  • the second electrical pulse signal comprises a second set of non-uniformly spaced pulses and at least one pulse in the second set of pulses is positioned at a different time-domain location than a corresponding pulse in the first set of pulses.
  • the method includes emitting a third pulse light signal according to a third electrical pulse signal.
  • the third electrical pulse signal comprises a third set of non-uniformly spaced pulses and no pulse in the third set of pulses shares a same time-domain location as any pulse in the first set of pulses.
  • each of the first, second, and third electrical pulse signal comprises a pulse signature. The pulse signature can be unique for each of the first, second, and third electrical pulse signal.
  • each point carries information about a vector that indicates an estimated surface normal of the object.
  • the method includes determining two orthogonal directions as the first direction and the second direction based on the vector.
  • the method includes selecting two orthogonal directions in a Cartesian coordinate system as the first direction and the second direction.
  • determining the coherence between the target point and the corresponding neighboring points includes examining information carried in the neighboring points of the target point and determining whether the target point and the neighboring points both represent the surface of the same object.
  • the information carried in a neighboring point includes a location of the neighboring point.
  • the method can include filtering out the target point upon determining that a distance between the neighboring point and the target point exceeds a predefined threshold.
  • the information carried in a neighboring point includes a color of the neighboring point. The method can include filtering out the target point upon determining that a color difference between the neighboring point and the target point exceeds a predefined threshold.
  • filtering the target point comprises constructing a bounding box for the target point, determining a number of neighboring points that are coherent with the target point, and filtering out the target point upon determining that the number of coherent neighboring points is equal to or smaller than a predefined threshold.
  • a non-transitory computer readable medium having processor code stored thereon including program code for performing a method that comprises emitting, by a light emitter of the light detection and ranging device, a pulse light signal towards the object according to an electrical pulse signal, the electrical pulse signal comprising a first set of non-uniformly spaced pulses.
  • the method includes receiving, by a receiver of the light detection and ranging device, one or more returned light signals reflected by the object and converting, by the receiver, the one or more return light signals into electrical signals.
  • the method also includes filtering a target point from a set of points representing at least a partial surface of the object as noise by determining whether there is a coherence between the target point and corresponding neighboring points of the target point along a first direction and a second direction of the model.
  • two adjacent pulses of the first set of pulses are separated by a randomized distance in the time domain.
  • determining the coherence between the target point and the corresponding neighboring points comprises examining information carried in the neighboring points of the target point and determining whether the target point and the neighboring points both represent the surface of the same object.
  • the information carried in a neighboring point includes at least a location of the neighboring point or a color of the neighboring point.
  • filtering the target point comprises constructing a bounding box for the target point, determining a number of neighboring points that are coherent with the target point, and filtering out the target point upon determining that the number of coherent neighboring points is equal to or smaller than a predefined threshold.
  • Implementations of the subject matter and the functional operations described in this patent document can be implemented in various systems, digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them.
  • Implementations of the subject matter described in this specification can be implemented as one or more computer program products, e.g., one or more modules of computer program instructions encoded on a tangible and non-transitory computer readable medium for execution by, or to control the operation of, data processing apparatus.
  • the computer readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them.
  • "data processing unit" or "data processing apparatus" encompasses all apparatus, devices, and machines for processing data, including, by way of example, a programmable processor, a computer, or multiple processors or computers.
  • the apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.
  • a computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
  • a computer program does not necessarily correspond to a file in a file system.
  • a program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code).
  • a computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • the processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output.
  • the processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
  • processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
  • a processor will receive instructions and data from a read only memory or a random-access memory, or both.
  • the essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data.
  • a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks.
  • mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks.
  • a computer need not have such devices.
  • Computer readable media suitable for storing computer program instructions and data include all forms of nonvolatile memory, media and memory devices, including, by way of example, semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices.
  • semiconductor memory devices e.g., EPROM, EEPROM, and flash memory devices.
  • the processor and the memory can he supplemented by, or incorporated in, special purpose logic circuitry.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Geophysics And Detection Of Objects (AREA)
  • Photometry And Measurement Of Optical Pulse Characteristics (AREA)

Abstract

Methods, apparatus, and systems related to light detection and ranging (LIDAR) are described. In one example aspect, a LIDAR apparatus includes a light emitter configured to generate, according to a first electrical pulse signal, a pulse light signal. The first electrical pulse signal comprises a first set of non-uniformly spaced pulses. The apparatus includes a receiver configured to convert returned light signals from the object into electrical signals and a filtering subsystem in communication with the receiver, configured to receive the electrical signals from the receiver and remove a point from a set of points representing at least a partial surface of the object as noise by determining whether there is a coherence between the point and corresponding neighboring points of the point along at least a first direction and a second direction of the set of points.

Description

INTERFERENCE MITIGATION FOR LIGHT DETECTION AND RANGING
BACKGROUND
[0001] Light Detection and Ranging (LIDAR) is a remote sensing method that uses light in the form of a pulsed laser to measure variable distances to the environment. LIDAR systems have a wide range of applications including autonomous driving and aerial mapping of a surface. These applications may place a high priority on the security, accuracy, and reliability of the operation. The disclosed embodiments provide methods and systems to reduce or minimize signal interference when multiple LIDAR sensors are present, thereby generating reliable data for determining surface features of the environment. The additional features and benefits of the disclosed technology, as well as their further applications and implementations, are described in the sections that follow.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] FIG. 1 illustrates an example schematic diagram of a LIDAR sensor in accordance with the present technology.
[0003] FIG. 2 illustrates an example schematic diagram of a LIDAR system in accordance with the present technology.
[0004] FIG. 3 illustrates another example schematic diagram of a LIDAR system in accordance with the present technology.
[0005] FIG. 4 depicts an example pulse signal generated by a controller.
[0006] FIG. 5A illustrates an example interference pattern that appears in a point cloud.
[0007] FIG. 5B illustrates an example interference pattern that appears in a point cloud.
[0008] FIG. 6 illustrates an example dithered pulse signal in accordance with the present technology.
[0009] FIG. 7 illustrates an example interference pattern after dithering the pulse signal in accordance with the present technology.
[0010] FIG. 8 illustrates an example of a filtering process in accordance with the present technology.
[0011] FIG. 9 is a flowchart representation of a method for sensing an object in an external environment using a light detection and ranging device in accordance with the present technology.
[0012] FIG. 10 is a block diagram illustrating an example of the architecture for a computer system or other control device that can be utilized to implement various portions of the presently disclosed technology.
DETAILED DESCRIPTION
[0013] LIDAR systems operate by transmitting a series of light pulses that reflect off objects. The reflected signal, or return signal, is received by the light detection and ranging system. Based on the detected time-of-flight (TOF), the system determines the range/distance between the system and the object.
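The time-of-flight relationship above can be sketched in a few lines; the function name and the example timing are illustrative assumptions, not part of the disclosure:

```python
# Sketch of the basic time-of-flight range calculation: the measured TOF covers
# the round trip of the pulse, so the one-way range is half the light travel.
C = 299_792_458.0  # speed of light in m/s

def tof_to_range(tof_seconds: float) -> float:
    """Convert a measured round-trip time-of-flight to a one-way range in meters."""
    return C * tof_seconds / 2.0

# Example: a return detected about 667 ns after firing corresponds to roughly 100 m.
print(round(tof_to_range(667e-9), 1))
```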
[0014] FIG. 1 illustrates an example schematic diagram of a LIDAR sensor 100 in accordance with the present technology. The LIDAR sensor 100 includes a light emitter 101 that emits a light signal to an object 111 in an external environment and a receiver 103 that receives the return signal reflected by the object 111. In some embodiments, the LIDAR sensor 100 adopts a beam steering mechanism to steer the light signal to the object 111. In some embodiments, the LIDAR sensor 100 can be configured to rotate around an axis to achieve a larger field of view (e.g., a 360-degree horizontal field of view). The receiver 103 may include an analog-to-digital (A2D) converter (not shown) that converts the return signal to a corresponding electrical signal. In some embodiments, the LIDAR sensor 100 may optionally include a controller 105. In some embodiments, the controller is a part of the LIDAR system. The controller 105 is in communication with both the light emitter 101 and the receiver 103 so that the controller 105 can control light emissions by the light emitter 101 and process the electrical signals from the receiver 103. In some embodiments, the controller 105 uses a set of points, such as a point cloud, based on the captured distances to represent at least a partial surface of the object 111 in the environment. The points in the point cloud can represent or indicate surface features of the object 111.
[0015] In some embodiments, the light emitter 101 can emit multiple light beams. FIG. 2 illustrates an example schematic diagram of a LIDAR system 200 in accordance with the present technology. The LIDAR system 200 includes a LIDAR sensor 201 that emits a plurality of beams over an angular range. In this particular example, the light emitter can emit 32 beams of light (also referred to as 32 channels). In some embodiments, the LIDAR system includes multiple LIDAR sensors to obtain a dense set of data for an accurate representation of the object. FIG. 3 illustrates an example schematic diagram of a LIDAR system 300 in accordance with the present technology. The LIDAR system 300 depicted in this example includes a first LIDAR sensor 301 and a second LIDAR sensor 303. Both of the sensors are capable of emitting multiple light beams. Because the sensors are positioned in proximity to each other within a single LIDAR system, crosstalk interference among the sensors can occur. Furthermore, with an increasing number of autonomous vehicles equipped with LIDAR systems to detect and avoid obstacles in the environment, direct interference among multiple LIDAR sensors may occur.
The reception of foreign light signals can lead to problems such as ghost targets or a reduced signal-to-noise ratio. Disclosed herein are techniques that can be implemented in various embodiments to reduce or minimize the impact of interference among multiple LIDAR sensors, thereby providing a more reliable model to represent the environment for safe maneuvering.
[0016] In some LIDAR systems, the light emitter generates pulsed light emissions in response to a pulse electrical signal provided by the controller. FIG. 4 depicts an example pulse electrical signal 400 generated by a controller. The pulse signal 400 includes multiple uniformly or non-uniformly spaced pulses. For example, the pulses can be uniformly distributed in a period of time Tp. That is, adjacent pulses are separated by the same distance t in the time domain. As discussed above, when there are multiple light beams generated according to the same pulse signal, direct or crosstalk interference among the signals is likely to occur. For example, for multi-sensor LIDAR systems, inter-sensor interference can happen, which includes direct interference when sensors are pointing at each other and indirect interference when signals from multiple sensors bounce off the object(s). For single-sensor multi-beam LIDAR systems, intra-sensor interference can happen. Signals from multiple beams can bounce off the object(s) and form an interference pattern. When signals at the same time-domain location interfere, the resulting interference may be transformed into ghost targets that are located at approximately the same distance from the LIDAR sensor. FIGS. 5A-B illustrate example interference patterns that can appear in a point cloud. In these examples, the interference can appear as a curved surface in the point cloud data set. The curved surface can be interpreted as a real obstacle in the environment, thereby severely impacting computations of the controller and subsequent navigation of the vehicle that carries the LIDAR system.
[0017] One way to minimize the impact of such interference is to distribute the interference signals across the field of view of the LIDAR sensor. If the interference shows up as small ghost points located at various distances from the LIDAR sensor, there is a lower probability for the controller to interpret these signals as a solid obstacle. Instead of generating pulses that are uniformly spaced in the time domain, the controller can control the light emitter to "dither" the pulse signal so that the firing time of the pulses is randomized in the period Tp. FIG. 6 illustrates an example dithered pulse signal in accordance with the present technology. A randomized Δtrandom can be added to or subtracted from t for each firing of the pulse. Thus, adjacent pulses are separated by different distances (t ± Δtrandom) in the time domain. For example, Δtrandom can be generated by a pseudo-random generation algorithm. In some embodiments, Δtrandom is up to around 1 μs. In the specific example shown in FIG. 6, t1 = t ± Δtrandom,1, t2 = t ± Δtrandom,2, and t3 = t ± Δtrandom,3. Due to the nature of pseudo-random generation, occasionally two adjacent pulses may have the same spacing as compared to two pulses in another sequence. For example, t1 in FIG. 6 can be equal to t in FIG. 4. However, the time-domain spacings of the pulses are sufficiently different due to the randomization so that the interference, when it does occur, is distributed in distance to form a more uniform noise pattern.
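A minimal sketch of this dithering scheme follows; the function name, the nominal spacing, and the jitter bound are assumptions chosen for illustration, not values from the disclosure:

```python
# Sketch of pulse-firing-time dithering: each inter-pulse spacing is the nominal
# period t plus or minus a pseudo-random offset (Δt_random in the text).
import random

def dithered_firing_times(num_pulses, t, max_jitter, seed=0):
    """Return firing times where adjacent pulses are separated by t +/- jitter."""
    rng = random.Random(seed)  # pseudo-random source, as described in the text
    times = [0.0]
    for _ in range(num_pulses - 1):
        jitter = rng.uniform(-max_jitter, max_jitter)  # this firing's Δt_random
        times.append(times[-1] + t + jitter)
    return times

# Nominal spacing of 10 µs with up to ±1 µs of random jitter.
times = dithered_firing_times(8, t=10e-6, max_jitter=1e-6)
spacings = [b - a for a, b in zip(times, times[1:])]
# The spacings are no longer uniform, so two emitters using different seeds
# are unlikely to interfere at the same time-domain location repeatedly.
print(len(set(round(s, 12) for s in spacings)) > 1)
```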
[0018] In some embodiments, the firing sequence can be dynamically re-ordered to allow a larger range of Δtrandom. In some embodiments, the dithering techniques described herein can be combined with a pulse signal signature, described in U.S. Pub. No. 2019/0178991, the entire content of which is incorporated by reference as part of the disclosure of this patent document, to further reduce or minimize the impact of interference. For example, each pulse signal can be given a pulse signal signature. In some embodiments, each pulse signal has a unique signature to ensure the security and reliability of the generated data. In some embodiments, the number of unique signatures available to the LIDAR sensor (also referred to as the code base) may be limited by various factors. Given a small code base, two LIDAR sensors may have the same signature at the same time, leading to an interference pattern such as those shown in FIGS. 5A-B.
Combining the dithering techniques with a code base of unique pulse signatures can greatly reduce the likelihood of signal interference by two LIDAR sensors sharing the same pulse signal.
[0019] FIG. 7 illustrates an example interference pattern after dithering the pulse signal in accordance with the present technology. As shown in FIG. 7, the interference is more uniformly distributed to resemble a white noise pattern. The controller can now apply a filtering process to remove the points that are deemed interference.

[0020] FIG. 8 illustrates an example of a filtering process in accordance with the present technology. In FIG. 8, a set of points has been collected to represent an object or at least a partial surface of the object in the environment. The collected points also include a significant amount of interference showing as noise. In order to determine whether a particular point is interference/noise or actual data representing the object, the controller can check information carried in the neighboring points to determine whether there is a coherence among the points.
[0021] In some embodiments, the points in the point cloud dataset are represented using a Cartesian coordinate system. Information carried by the neighboring points in the x, y, and/or z directions, such as the location or the color, can be used to determine whether the point and its neighbors represent the same surface. For example, as shown in FIG. 8, point 801 can be checked against its neighboring points in the x, y, and/or z directions. The neighboring points along the x direction of point 801 do not provide sufficient information because they also appear to be noise. However, the neighboring points along the y direction of point 801 indicate that there is no coherence between point 801 and some of the neighbors: they appear to represent two separate surfaces. Point 801 can thus be filtered out as a noise point. Similarly, point 802 can be checked against its neighboring points. The neighboring points along both the x and y directions indicate that there exists a coherence between point 802 and its neighbors. Thus, point 802 remains in the model.
[0022] In some embodiments, the points in the point cloud carry a normal vector N indicating an estimated surface normal of the object. Based on the normal vector N, two substantially orthogonal directions x' and y' can be determined. Information carried by the neighboring points in x' and y' can be used to determine whether the point and its neighbors represent the same surface. For example, in some embodiments, each point carries color information. If the point has the same color as its neighboring points (or if the color difference is equal to or less than a predefined threshold), a point coherence exists. As another example, if a distance between the point and its neighboring point exceeds a predefined threshold, the controller can determine that there is no coherence between the point and its neighbors. When there is no coherence between a point and its neighbors, the point is removed or filtered from the model as noise.
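The coherence test described above can be sketched as follows. The point representation, field names, and threshold values are illustrative assumptions; the disclosure only specifies that distance and color differences are compared against predefined thresholds along two directions:

```python
# Sketch of the per-point coherence check: a target point is compared with its
# neighbors along two examined directions using distance and color thresholds.
import math

def points_coherent(p, q, max_dist=0.5, max_color_diff=30):
    """Return True if points p and q plausibly lie on the same surface."""
    dist = math.dist(p["xyz"], q["xyz"])          # spatial separation
    color_diff = abs(p["color"] - q["color"])     # color separation
    return dist <= max_dist and color_diff <= max_color_diff

def is_noise(target, neighbors_dir1, neighbors_dir2):
    """Treat the target as noise if it is coherent with no neighbor along
    either of the two examined directions."""
    for nb in list(neighbors_dir1) + list(neighbors_dir2):
        if points_coherent(target, nb):
            return False
    return True

# A small surface patch along one direction, plus a distant "ghost" return.
surface = [{"xyz": (0.0, float(i), 0.0), "color": 100} for i in range(3)]
ghost = {"xyz": (0.0, 10.0, 0.0), "color": 100}  # far from every neighbor
print(is_noise(ghost, surface, []))  # the ghost point is filtered as noise
```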
[0023] Examining neighboring points in two or more directions can provide a more reliable determination of the point coherence when the object has a sharp surface feature or a small dimension along one direction. For example, if the object is a thin, long pole, examining points along the x direction in the Cartesian coordinate system may not provide a sufficient amount of information because there are only a few surface points available. Examining points along the y direction, however, allows the system to use more surface points of the pole to determine whether a coherence exists between the target point (potentially a noise point) and the object.
[0024] In some embodiments, a bounding box is placed around the target point. The spatial neighbors of the target point generated by prior and/or next signals from the same channel or concurrent/prior/next signals from a vertically adjacent channel are examined. If the distance between a neighboring point and the target point is equal to or less than a threshold, the target point can be considered a coherent point relative to the neighboring point. Other factors, such as the color of the object, the reflectivity of the object, the strength of the signal, and the signal-to-noise ratio (SNR) of the signal, are also considered to determine the coherence. In some implementations, when the number of coherent points in the bounding box exceeds a predetermined threshold (e.g., nine neighboring returns), the target point is considered a valid data point instead of noise/interference.
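The bounding-box filter in [0024] can be sketched as below, considering only the distance criterion. The box size, coherence distance, and function names are assumptions for illustration; the nine-neighbor threshold follows the example in the text:

```python
# Sketch of the bounding-box filter: count spatial neighbors that are coherent
# with the target, and keep the target only if the count exceeds a threshold.

def in_bounding_box(target, point, half_size=1.0):
    """True if `point` falls inside an axis-aligned box centered on `target`."""
    return all(abs(t - p) <= half_size for t, p in zip(target, point))

def is_valid_return(target, cloud, coherence_dist=0.3, min_coherent=9):
    """A target is a valid data point when more than `min_coherent` neighbors
    in its bounding box lie within the coherence distance; otherwise it is
    treated as noise/interference."""
    coherent = 0
    for p in cloud:
        if p == target or not in_bounding_box(target, p):
            continue
        dist = sum((t - q) ** 2 for t, q in zip(target, p)) ** 0.5
        if dist <= coherence_dist:
            coherent += 1
    return coherent > min_coherent

# A dense patch of surface returns around the target keeps it in the model;
# an isolated ghost point far from the patch would be dropped.
target = (0.0, 0.0, 0.0)
patch = [(0.05 * i, 0.05 * j, 0.0) for i in range(-2, 3) for j in range(-2, 3)]
print(is_valid_return(target, patch))
```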
[0025] In some embodiments, to obtain optimal real-time performance, the LIDAR sensor may track a small set of points in the point cloud dataset instead of the entire dataset for the object(s) in the environment. Thus, instead of filtering the entire dataset at once, the LIDAR sensor or a filtering subsystem of the LIDAR sensor performs real-time filtering. The real-time filtering can be performed on one object within a selected few frames, or on multiple objects within one frame. The controller or the filtering subsystem can also make certain assumptions about the object (e.g., smoothness, color, location, size) to facilitate real-time filtering of the noise/interference points.
[0026] In some implementations, a confidence of validity of the filtering process can be derived based on the number of coherent points in the bounding box. As discussed above, the LIDAR sensors can emit multiple sets of signals concurrently. A combination of the dithering pattern (e.g., the burstiness of multiple channels) and a low confidence of validity in the filtering process can be used together to indicate whether the target point is a result of direct or crosstalk interference.
[0027] FIG. 9 is a flowchart representation of a method 900 for sensing an object in an external environment using a light detection and ranging device in accordance with the present technology. The method 900 includes, at operation 902, emitting, by a light emitter of the light detection and ranging device, a pulse light signal towards the object according to an electrical pulse signal. The electrical pulse signal comprises a first set of non-uniformly spaced pulses. The method 900 includes, at operation 904, receiving, by a receiver of the light detection and ranging device, one or more returned light signals reflected by the object. The method 900 includes, at operation 906, converting, by the receiver, the one or more returned light signals into electrical signals. The method 900 includes, at operation 908, generating, based on the electrical signals, a model that comprises a plurality of points representing a surface of the object. The method 900 includes, at operation 910, filtering a target point from a set of points representing at least a partial surface of the object as noise by determining whether there is a coherence between the target point and corresponding neighboring points of the target point along a first direction and a second direction of the model.
[0028] FIG. 10 is a block diagram illustrating an example of the architecture for a computer system or other control device 1000 that can be utilized to implement various portions of the presently disclosed technology, such as the controller 105 shown in FIG. 1. In FIG. 10, the computer system 1000 includes one or more processors 1005 and memory 1010 connected via an interconnect 1025. The interconnect 1025 may represent any one or more separate physical buses, point-to-point connections, or both, connected by appropriate bridges, adapters, or controllers. The interconnect 1025, therefore, may include, for example, a system bus, a Peripheral Component Interconnect (PCI) bus, a HyperTransport or industry standard architecture (ISA) bus, a small computer system interface (SCSI) bus, a universal serial bus (USB), an IIC (I2C) bus, or an Institute of Electrical and Electronics Engineers (IEEE) standard 1394 bus, sometimes referred to as "Firewire."
[0029] The processor(s) 1005 may include central processing units (CPUs) to control the overall operation of, for example, the host computer. In certain embodiments, the processor(s) 1005 accomplish this by executing software or firmware stored in memory 1010. The processor(s) 1005 may be, or may include, one or more programmable general-purpose or special-purpose microprocessors, digital signal processors (DSPs), programmable controllers, application specific integrated circuits (ASICs), programmable logic devices (PLDs), or the like, or a combination of such devices.
[0030] The memory 1010 can be, or include, the main memory of the computer system. The memory 1010 represents any suitable form of random-access memory (RAM), read-only memory (ROM), flash memory, or the like, or a combination of such devices. In use, the memory 1010 may contain, among other things, a set of machine instructions which, when executed by processor 1005, causes processor 1005 to perform operations to implement embodiments of the presently disclosed technology.
[0031] Also connected to the processor(s) 1005 through the interconnect 1025 is an (optional) network adapter 1015. The network adapter 1015 provides the computer system 1000 with the ability to communicate with remote devices, such as the storage clients, and/or other storage servers, and may be, for example, an Ethernet adapter or Fibre Channel adapter.
[0032] It is thus evident that the disclosed techniques can be implemented in various embodiments to effectively reduce the impact of signal interference in LIDAR sensors and systems. In one example aspect, a light detection and ranging apparatus includes a light emitter configured to generate, according to a first electrical pulse signal, a pulse light signal that is directed toward an object in an external environment. The first electrical pulse signal comprises a first set of non-uniformly spaced pulses. The apparatus includes a receiver configured to receive one or more returned light signals reflected by the object and convert the returned light signals into electrical signals, and a filtering subsystem in communication with the receiver. The filtering subsystem is configured to receive the electrical signals from the receiver and remove a point from a set of points representing at least a partial surface of the object as noise by determining whether there is a coherence between the point and corresponding neighboring points of the point along at least a first direction and a second direction of the set of points.
[0033] In some embodiments, two adjacent pulses of the first set of pulses are separated by a randomized distance in the time domain. In some embodiments, the light emitter is configured to generate a second pulse light signal according to a second electrical pulse signal, wherein the second electrical pulse signal comprises a second set of non-uniformly spaced pulses, and wherein at least one pulse in the second set of pulses is positioned at a different time-domain location than a corresponding pulse in the first set of pulses. In some embodiments, the light emitter is configured to generate a third pulse light signal according to a third electrical pulse signal, wherein the third electrical pulse signal comprises a third set of non-uniformly spaced pulses, and wherein no pulse in the third set of pulses shares a same time-domain location as any pulse in the first set of pulses. In some embodiments, each of the first, second, and third electrical pulse signals comprises a pulse signature. The pulse signature can be unique for each of the first, second, and third electrical pulse signals.
[0034] In some embodiments, each point carries information about a vector that indicates an estimated surface normal of the object, and wherein the first direction and the second direction are determined based on the vector. In some embodiments, the first direction and the second direction are two directions in a Cartesian coordinate system.
[0035] In some embodiments, determining the coherence between the point and the corresponding neighboring points comprises examining information carried in the neighboring points of the point and determining whether the point and the neighboring points both represent the surface of the same object. In some embodiments, the information carried in a neighboring point includes at least a location of the neighboring point or a color of the neighboring point.
[0036] In another example aspect, a method for sensing an object in an external environment using a light detection and ranging device includes emitting, by a light emitter of the light detection and ranging device, a pulse light signal towards the object according to an electrical pulse signal, the electrical pulse signal comprising a first set of non-uniformly spaced pulses. The method includes receiving, by a receiver of the light detection and ranging device, one or more returned light signals reflected by the object and converting, by the receiver, the one or more returned light signals into electrical signals. The method also includes filtering a target point from a set of points representing at least a partial surface of the object as noise by determining whether there is a coherence between the target point and corresponding neighboring points of the target point along a first direction and a second direction of the model.
[0037] In some embodiments, two adjacent pulses of the first set of pulses are separated by a randomized distance in the time domain. In some embodiments, the method includes emitting a second pulse light signal according to a second electrical pulse signal. The second electrical pulse signal comprises a second set of non-uniformly spaced pulses, and at least one pulse in the second set of pulses is positioned at a different time-domain location than a corresponding pulse in the first set of pulses. In some embodiments, the method includes emitting a third pulse light signal according to a third electrical pulse signal. The third electrical pulse signal comprises a third set of non-uniformly spaced pulses, and no pulse in the third set of pulses shares a same time-domain location as any pulse in the first set of pulses. In some embodiments, each of the first, second, and third electrical pulse signals comprises a pulse signature. The pulse signature can be unique for each of the first, second, and third electrical pulse signals.
[0038] In some embodiments, each point carries information about a vector that indicates an estimated surface normal of the object. The method includes determining two orthogonal directions as the first direction and the second direction based on the vector. In some embodiments, the method includes selecting two orthogonal directions in a Cartesian coordinate system as the first direction and the second direction.
[0039] In some embodiments, determining the coherence between the target point and the corresponding neighboring points includes examining information carried in the neighboring points of the target point and determining whether the target point and the neighboring points both represent the surface of the same object. In some embodiments, the information carried in a neighboring point includes a location of the neighboring point. In some embodiments, the method can include filtering out the target point upon determining that a distance between the neighboring point and the target point exceeds a predefined threshold. In some embodiments, the information carried in a neighboring point includes a color of the neighboring point. The method can include filtering out the target point upon determining that a color difference between the neighboring point and the target point exceeds a predefined threshold.
[0040] In some embodiments, filtering the target point comprises constructing a bounding box for the target point, determining a number of neighboring points that are coherent with the target point, and filtering out the target point upon determining that the number of coherent neighboring points is equal to or smaller than a predefined threshold.
[0041] In another example aspect, a non-transitory computer-readable medium has processor code stored thereon that includes program code for performing a method that comprises emitting, by a light emitter of the light detection and ranging device, a pulse light signal towards the object according to an electrical pulse signal, the electrical pulse signal comprising a first set of non-uniformly spaced pulses. The method includes receiving, by a receiver of the light detection and ranging device, one or more returned light signals reflected by the object and converting, by the receiver, the one or more returned light signals into electrical signals. The method also includes filtering a target point from a set of points representing at least a partial surface of the object as noise by determining whether there is a coherence between the target point and corresponding neighboring points of the target point along a first direction and a second direction of the model.

[0042] In some embodiments, two adjacent pulses of the first set of pulses are separated by a randomized distance in the time domain. In some embodiments, determining the coherence between the target point and the corresponding neighboring points comprises examining information carried in the neighboring points of the target point and determining whether the target point and the neighboring points both represent the surface of the same object. In some embodiments, the information carried in a neighboring point includes at least a location of the neighboring point or a color of the neighboring point. In some embodiments, filtering the target point comprises constructing a bounding box for the target point, determining a number of neighboring points that are coherent with the target point, and filtering out the target point upon determining that the number of coherent neighboring points is equal to or smaller than a predefined threshold.
[0043] Implementations of the subject matter and the functional operations described in this patent document can be implemented in various systems, digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Implementations of the subject matter described in this specification can be implemented as one or more computer program products, e.g., one or more modules of computer program instructions encoded on a tangible and non-transitory computer readable medium for execution by, or to control the operation of, data processing apparatus. The computer readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The term "data processing unit" or "data processing apparatus" encompasses all apparatus, devices, and machines for processing data, including, by way of example, a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.
[0044] A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code).
A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
[0045] The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
[0046] Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random-access memory, or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer need not have such devices. Computer readable media suitable for storing computer program instructions and data include all forms of nonvolatile memory, media and memory devices, including, by way of example, semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
[0047] It is intended that the specification, together with the drawings, be considered exemplary only, where exemplary means an example. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Additionally, the use of “or” is intended to include “and/or,” unless the context clearly indicates otherwise.
[0048] While this patent document contains many specifics, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features that may be specific to particular embodiments of particular inventions. Certain features that are described in this patent document, in the context of separate embodiments, can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple
embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
[0049] Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown, or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Moreover, the separation of various system components in the embodiments described in this patent document should not be understood as requiring such separation in all
embodiments.
[0050] Only a few implementations and examples are described, and other implementations, enhancements and variations can be made based on what is described and illustrated in this patent document.

Claims

A set of example claims includes:
1. A light detection and ranging apparatus, comprising:
a light emitter configured to generate, according to a first electrical pulse signal, a pulse light signal that is directed toward an object in an external environment, the first electrical pulse signal comprising a first set of non-uniformly spaced pulses;
a receiver configured to receive one or more returned light signals reflected by the object and convert the returned light signals into electrical signals; and
a filtering subsystem in communication with the receiver, configured to:
receive the electrical signals from the receiver, and
remove a point from a set of points representing at least a partial surface of the object as noise by determining whether there is a coherence between the point and corresponding neighboring points of the point along at least a first direction and a second direction of the set of points.
2. The apparatus of claim 1, wherein two adjacent pulses of the first set of pulses are separated by a randomized distance in the time domain.
3. The apparatus of claim 1, wherein the light emitter is configured to generate a second pulse light signal according to a second electrical pulse signal, wherein the second electrical pulse signal comprises a second set of non-uniformly spaced pulses, and wherein at least one pulse in the second set of pulses is positioned at a different time-domain location than a corresponding pulse in the first set of pulses.
4. The apparatus of claim 1, wherein the light emitter is configured to generate a third pulse light signal according to a third electrical pulse signal, wherein the third electrical pulse signal comprises a third set of non-uniformly spaced pulses, and wherein no pulse in the third set of pulses shares a same time-domain location as any pulse in the first set of pulses.
5. The apparatus of any of claims 1 to 4, wherein each of the first, second, and third electrical pulse signals comprises a pulse signature.
6. The apparatus of claim 5, wherein the pulse signature is unique for each of the first, second, and third electrical pulse signals.
7. The apparatus of claim 1, wherein each point carries information about a vector that indicates an estimated surface normal of the object, and wherein the first direction and the second direction are determined based on the vector.
8. The apparatus of claim 1, wherein the first direction and the second direction are two directions in a Cartesian coordinate system.
9. The apparatus of claim 1, wherein determining the coherence between the point and the corresponding neighboring points comprises:
examining information carried in the neighboring points of the point; and
determining whether the point and the neighboring points both represent the surface of the same object.
10. The apparatus of claim 1, wherein the information carried in a neighboring point includes at least a location of the neighboring point or a color of the neighboring point.
11. A method for sensing an object in an external environment using a light detection and ranging device, comprising:
emitting, by a light emitter of the light detection and ranging device, a pulse light signal towards the object according to an electrical pulse signal, the electrical pulse signal comprising a first set of non-uniformly spaced pulses;
receiving, by a receiver of the light detection and ranging device, one or more returned light signals reflected by the object;
converting, by the receiver, the one or more returned light signals into electrical signals; and filtering a target point from a set of points representing at least a partial surface of the object as noise by determining whether there is a coherence between the target point and corresponding neighboring points of the target point along a first direction and a second direction of the set of points.
12. The method of claim 11, wherein two adjacent pulses of the first set of pulses are separated by a randomized distance in the time domain.
13. The method of claim 11, comprising:
emitting a second pulse light signal according to a second electrical pulse signal, wherein the second electrical pulse signal comprises a second set of non-uniformly spaced pulses, and wherein at least one pulse in the second set of pulses is positioned at a different time-domain location than a corresponding pulse in the first set of pulses.
14. The method of claim 11, comprising:
emitting a third pulse light signal according to a third electrical pulse signal, wherein the third electrical pulse signal comprises a third set of non-uniformly spaced pulses, and wherein no pulse in the third set of pulses shares a same time-domain location as any pulse in the first set of pulses.
15. The method of claim 14, wherein each of the first, second, and third electrical pulse signals comprises a pulse signature.
16. The method of claim 15, wherein the pulse signature is unique for each of the first, second, and third electrical pulse signals.
17. The method of claim 11, wherein each point carries information about a vector that indicates an estimated surface normal of the object, and wherein the method comprises:
determining two orthogonal directions as the first direction and the second direction based on the vector.
18. The method of claim 11, comprising: selecting two orthogonal directions in a Cartesian coordinate system as the first direction and the second direction.
19. The method of claim 11, wherein determining the coherence between the target point and the corresponding neighboring points comprises:
examining information carried in the neighboring points of the target point; and determining whether the target point and the neighboring points both represent the surface of the same object.
20. The method of claim 11, wherein the information carried in a neighboring point includes a location of the neighboring point.
21. The method of claim 20, comprising:
filtering out the target point upon determining that a distance between the neighboring point and the target point exceeds a predefined threshold.
22. The method of claim 11, wherein the information carried in a neighboring point includes a color of the neighboring point.
23. The method of claim 22, comprising:
filtering out the target point upon determining that a color difference between the neighboring point and the target point exceeds a predefined threshold.
24. The method of claim 11, wherein filtering the target point comprises:
constructing a bounding box for the target point;
determining a number of neighboring points that are coherent with the target point; and filtering out the target point upon determining that the number of coherent neighboring points is equal to or smaller than a predefined threshold.
25. A non-transitory computer readable medium having processor code stored thereon including program code for performing a method that comprises: emitting, by a light emitter of a light detection and ranging device, a pulse light signal towards an object according to an electrical pulse signal, the electrical pulse signal comprising a first set of non-uniformly spaced pulses;
receiving, by a receiver of the light detection and ranging device, one or more returned light signals reflected by the object;
converting, by the receiver, the one or more returned light signals into electrical signals; and filtering a target point from a set of points representing at least a partial surface of the object as noise by determining whether there is a coherence between the target point and corresponding neighboring points of the target point along a first direction and a second direction of the set of points.
26. The non-transitory computer readable medium of claim 25, wherein two adjacent pulses of the first set of pulses are separated by a randomized distance in time domain.
27. The non-transitory computer readable medium of claim 25, wherein determining the coherence between the target point and the corresponding neighboring points comprises:
examining information carried in the neighboring points of the target point; and determining whether the target point and the neighboring points both represent the surface of the same object.
28. The non-transitory computer readable medium of claim 25, wherein the information carried in a neighboring point includes at least a location of the neighboring point or a color of the neighboring point.
29. The non-transitory computer readable medium of claim 25, wherein filtering the target point comprises:
constructing a bounding box for the target point;
determining a number of neighboring points that are coherent with the target point; and filtering out the target point upon determining that the number of coherent neighboring points is equal to or smaller than a predefined threshold.
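The randomized pulse spacing recited in claims 2–6 (and 12–16, 26) can be illustrated with a short sketch. This is not the patented implementation, only a minimal Python illustration under assumed parameters: the function names, nanosecond units, and the matching tolerance are hypothetical, and a real device would compare detected return trains against the emitted schedule in hardware.

```python
import random

def make_pulse_schedule(num_pulses, base_interval_ns, jitter_ns, seed=None):
    """Return pulse firing times (ns) with non-uniform, randomized spacing.

    Each inter-pulse gap is a nominal interval plus a random offset, so
    two independently seeded units are unlikely to emit colliding trains.
    """
    rng = random.Random(seed)
    times = [0.0]
    for _ in range(num_pulses - 1):
        gap = base_interval_ns + rng.uniform(-jitter_ns, jitter_ns)
        times.append(times[-1] + gap)
    return times

def matches_signature(schedule, returns_ns, tol_ns=1.0):
    """Accept a return train only if its gaps reproduce the emitted
    schedule (the schedule acts as a per-firing pulse signature)."""
    if len(returns_ns) != len(schedule):
        return False
    offset = returns_ns[0] - schedule[0]  # common time-of-flight delay
    return all(abs((r - offset) - s) <= tol_ns
               for r, s in zip(returns_ns, schedule))
```

A return train that is merely a delayed copy of the emitted schedule matches; a train whose internal spacing differs (e.g., another unit's interfering pulses) does not.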
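The coherence-based filtering recited in claims 19–24 (and 27–29) can likewise be sketched: a target point is kept only if enough neighboring points (e.g., those inside its bounding box) agree with it in location and color. The sketch below is a hedged illustration only; the point representation (dicts with `xyz` and `rgb` fields) and all threshold values are assumptions, not taken from the patent.

```python
import math

def is_coherent(target, neighbor, dist_thresh, color_thresh):
    """A neighbor is coherent with the target if it is close in space
    and similar in color (both thresholds are illustrative)."""
    close = math.dist(target["xyz"], neighbor["xyz"]) <= dist_thresh
    similar = all(abs(a - b) <= color_thresh
                  for a, b in zip(target["rgb"], neighbor["rgb"]))
    return close and similar

def filter_point(target, neighbors, dist_thresh=0.5, color_thresh=30,
                 min_coherent=2):
    """Keep the target only if the number of coherent neighbors exceeds
    a predefined threshold; otherwise flag it as noise (claim 24)."""
    count = sum(1 for n in neighbors
                if is_coherent(target, n, dist_thresh, color_thresh))
    return count > min_coherent  # False means: filter out as noise
```

An isolated return (few coherent neighbors) is thus discarded as likely interference, while points that sit on a locally consistent surface survive.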
PCT/US2020/026925 2019-07-01 2020-04-06 Interference mitigation for light detection and ranging WO2021002912A1 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
JP2022500148A JP7546647B2 (en) 2019-07-01 2020-04-06 Interference mitigation for light detection and ranging.
MX2021016061A MX2021016061A (en) 2019-07-01 2020-04-06 Interference mitigation for light detection and ranging.
EP20834220.4A EP3973316A4 (en) 2019-07-01 2020-04-06 Interference mitigation for light detection and ranging
CA3144656A CA3144656A1 (en) 2019-07-01 2020-04-06 Interference mitigation for light detection and ranging
KR1020227003050A KR20220025872A (en) 2019-07-01 2020-04-06 Interference mitigation for light detection and ranging
CN202080060860.8A CN114270215A (en) 2019-07-01 2020-04-06 Interference mitigation for light detection and ranging
IL289131A IL289131B2 (en) 2019-07-01 2021-12-19 Interference mitigation for light detection and ranging

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16/459,557 2019-07-01
US16/459,557 US10613203B1 (en) 2019-07-01 2019-07-01 Interference mitigation for light detection and ranging

Publications (1)

Publication Number Publication Date
WO2021002912A1 true WO2021002912A1 (en) 2021-01-07

Family

ID=70056578

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/026925 WO2021002912A1 (en) 2019-07-01 2020-04-06 Interference mitigation for light detection and ranging

Country Status (9)

Country Link
US (2) US10613203B1 (en)
EP (1) EP3973316A4 (en)
JP (1) JP7546647B2 (en)
KR (1) KR20220025872A (en)
CN (1) CN114270215A (en)
CA (1) CA3144656A1 (en)
IL (1) IL289131B2 (en)
MX (1) MX2021016061A (en)
WO (1) WO2021002912A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022550200A (en) * 2019-10-02 2022-11-30 セプトン テクノロジーズ,インコーポレイテッド Crosstalk Interference Detection Technology in Lidar Imaging Sensors

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE46672E1 (en) 2006-07-13 2018-01-16 Velodyne Lidar, Inc. High definition LiDAR system
US10627490B2 (en) 2016-01-31 2020-04-21 Velodyne Lidar, Inc. Multiple pulse, LIDAR based 3-D imaging
JP7149256B2 (en) 2016-03-19 2022-10-06 ベロダイン ライダー ユーエスエー,インコーポレイテッド Integrated illumination and detection for LIDAR-based 3D imaging
US10393877B2 (en) 2016-06-01 2019-08-27 Velodyne Lidar, Inc. Multiple pixel scanning LIDAR
JP7290571B2 (en) 2017-03-31 2023-06-13 ベロダイン ライダー ユーエスエー,インコーポレイテッド Integrated LIDAR lighting output control
JP2020519881A (en) 2017-05-08 2020-07-02 ベロダイン ライダー, インク. LIDAR data collection and control
EP3646057A1 (en) 2017-06-29 2020-05-06 Apple Inc. Time-of-flight depth mapping with parallax compensation
US11294041B2 (en) 2017-12-08 2022-04-05 Velodyne Lidar Usa, Inc. Systems and methods for improving detection of a return signal in a light ranging and detection system
WO2019125349A1 (en) 2017-12-18 2019-06-27 Montrose Laboratories Llc Time-of-flight sensing using an addressable array of emitters
US11971507B2 (en) 2018-08-24 2024-04-30 Velodyne Lidar Usa, Inc. Systems and methods for mitigating optical crosstalk in a light ranging and detection system
US10712434B2 (en) 2018-09-18 2020-07-14 Velodyne Lidar, Inc. Multi-channel LIDAR illumination driver
US11082010B2 (en) 2018-11-06 2021-08-03 Velodyne Lidar Usa, Inc. Systems and methods for TIA base current detection and compensation
US12061263B2 (en) 2019-01-07 2024-08-13 Velodyne Lidar Usa, Inc. Systems and methods for a configurable sensor system
US11885958B2 (en) 2019-01-07 2024-01-30 Velodyne Lidar Usa, Inc. Systems and methods for a dual axis resonant scanning mirror
KR102604902B1 (en) 2019-02-11 2023-11-21 애플 인크. Depth sensing using sparse arrays of pulsed beams
US11500094B2 (en) 2019-06-10 2022-11-15 Apple Inc. Selection of pulse repetition intervals for sensing time of flight
US10613203B1 (en) 2019-07-01 2020-04-07 Velodyne Lidar, Inc. Interference mitigation for light detection and ranging
US11555900B1 (en) 2019-07-17 2023-01-17 Apple Inc. LiDAR system with enhanced area coverage
US11733359B2 (en) 2019-12-03 2023-08-22 Apple Inc. Configurable array of single-photon detectors
US11892572B1 (en) 2020-12-30 2024-02-06 Waymo Llc Spatial light modulator retroreflector mitigation
CN115079115A (en) * 2021-03-10 2022-09-20 上海禾赛科技有限公司 Radar, data processing method and device for radar, and readable storage medium
MX2023011405A (en) * 2021-04-07 2024-01-17 Hesai Technology Co Ltd Laser radar and distance measuring method.
CN113189605B (en) * 2021-04-08 2022-06-17 中电海康集团有限公司 Method and system for improving laser ranging precision based on uncertainty
US11681028B2 (en) 2021-07-18 2023-06-20 Apple Inc. Close-range measurement of time of flight using parallax shift
EP4202499A1 (en) * 2021-12-23 2023-06-28 Suteng Innovation Technology Co., Ltd Lidar anti-interference method and apparatus, storage medium, and lidar

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5563706A (en) * 1993-08-24 1996-10-08 Nikon Corporation Interferometric surface profiler with an alignment optical member
US6201236B1 (en) * 1997-11-13 2001-03-13 Auto Sense Ltd. Detection system with improved noise tolerance
US7131586B2 (en) * 2000-06-07 2006-11-07 Metrologic Instruments, Inc. Method of and apparatus for reducing speckle-pattern noise in a planar laser illumination and imaging (PLIIM) based system
US20150109290A1 (en) 2013-10-22 2015-04-23 Fu Tai Hua Industry (Shenzhen) Co., Ltd. Device and method for removing noise points in point clouds
US20150144806A1 (en) * 2012-05-29 2015-05-28 Macquarie University Two-directional scanning for luminescence microscopy
US20160252617A1 (en) 2015-02-27 2016-09-01 Denso Corporation Object recognition apparatus and noise removal method
US20160327646A1 (en) 2015-05-07 2016-11-10 GM Global Technology Operations LLC Pseudo random sequences in array lidar systems
US20180058197A1 (en) * 2015-12-28 2018-03-01 Halliburton Energy Services, Inc. Distributed optical sensing using compressive sampling
WO2018129408A1 (en) 2017-01-05 2018-07-12 Innovusion Ireland Limited Method and system for encoding and decoding lidar
US20190056497A1 (en) 2017-03-01 2019-02-21 Ouster, Inc. Accurate photo detector measurements for lidar

US7489865B2 (en) 2002-02-01 2009-02-10 Cubic Corporation Integrated optical communication and range finding system and applications thereof
US6741341B2 (en) 2002-02-04 2004-05-25 Bae Systems Information And Electronic Systems Integration Inc Reentry vehicle interceptor with IR and variable FOV laser radar
US20030163030A1 (en) 2002-02-25 2003-08-28 Arriaga Moises A. Hollow endoscopy
US7868665B2 (en) 2002-03-05 2011-01-11 Nova R&D, Inc. Integrated circuit and sensor for imaging
IL148795A0 (en) 2002-03-20 2002-09-12 Vital Medical Ltd Apparatus and method for monitoring tissue vitality parameters for the diagnosis of body metabolic emergency state
WO2003088485A1 (en) 2002-04-10 2003-10-23 The Johns Hopkins University The time of flight system on a chip
US6876790B2 (en) 2002-05-17 2005-04-05 Science & Engineering Services, Inc. Method of coupling a laser signal to an optical carrier
DE10230397A1 (en) 2002-07-05 2004-01-15 Sick Ag laser scanning
DE10244641A1 (en) 2002-09-25 2004-04-08 Ibeo Automobile Sensor Gmbh Optoelectronic position monitoring system for road vehicle has two pulsed lasers, sensor and mechanical scanner with mirror at 45 degrees on shaft with calibration disk driven by electric motor
US20040066500A1 (en) 2002-10-02 2004-04-08 Gokturk Salih Burak Occupancy detection and measurement system and method
US7139459B2 (en) * 2002-10-16 2006-11-21 Lake Shore Cryotronics, Inc. Spectral filter for green and longer wavelengths
US6879419B2 (en) 2002-12-05 2005-04-12 Northrop Grumman Corporation Laser scanner with peripheral scanning capability
DE10258794A1 (en) 2002-12-16 2004-06-24 Ibeo Automobile Sensor Gmbh Detecting/tracking objects, e.g. in front of vehicles, involves using object profile from image points to predict contours for objects in preceding cycle starting from respective profile in preceding cycle
US6781677B1 (en) 2003-01-31 2004-08-24 The Boeing Company Laser range finding apparatus
EP1595131A4 (en) 2003-02-10 2008-11-26 Univ Virginia System and method for remote sensing and/or analyzing spectral properties of targets and/or chemical species for detection and identification thereof
US7248342B1 (en) 2003-02-14 2007-07-24 United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Three-dimension imaging lidar
GB2398841A (en) 2003-02-28 2004-09-01 Qinetiq Ltd Wind turbine control having a Lidar wind speed measurement apparatus
US7106424B2 (en) 2003-03-11 2006-09-12 Rosemount Aerospace Inc. Compact laser altimeter system
US20040213463A1 (en) 2003-04-22 2004-10-28 Morrison Rick Lee Multiplexed, spatially encoded illumination system for determining imaging and range estimation
US7379559B2 (en) 2003-05-28 2008-05-27 Trw Automotive U.S. Llc Method and apparatus for determining an occupant's head location in an actuatable occupant restraining system
US7089114B1 (en) 2003-07-03 2006-08-08 Baojia Huang Vehicle collision avoidance system and method
DE10360889A1 (en) 2003-12-19 2005-07-14 Robert Bosch Gmbh System with two or more sensors
JP3908226B2 (en) 2004-02-04 2007-04-25 Nidec Corporation Scanning range sensor
US7373473B2 (en) 2004-03-10 2008-05-13 Leica Geosystems Hds Llc System and method for efficient storage and manipulation of extremely large amounts of scan data
US7323670B2 (en) 2004-03-16 2008-01-29 Leica Geosystems Hds Llc Laser operation for survey instruments
US8042056B2 (en) 2004-03-16 2011-10-18 Leica Geosystems Ag Browsers for large geometric data visualization
US7187823B2 (en) 2004-03-16 2007-03-06 Leica Geosystems Hds Llc Contact-free slip ring for survey instrumentation
US7583364B1 (en) 2004-03-19 2009-09-01 University Corporation For Atmospheric Research High pulse-energy, eye-safe lidar system
WO2005100613A2 (en) 2004-04-13 2005-10-27 Hyo Sang Lee Ultraviolet lidar for detection of biological warfare agents
DE102004018813A1 (en) 2004-04-19 2006-02-23 Ibeo Automobile Sensor Gmbh Method for detecting and / or tracking objects
CA2505715A1 (en) 2004-05-03 2005-11-03 Her Majesty In Right Of Canada As Represented By The Minister Of National Defence Volumetric sensor for mobile robotics
JP2005321403A (en) 2004-05-10 2005-11-17 Ibeo Automobile Sensor Gmbh Method and device for measuring distance
US7240314B1 (en) 2004-06-04 2007-07-03 Magma Design Automation, Inc. Redundantly tied metal fill for IR-drop and layout density optimization
DE102004033114A1 (en) 2004-07-08 2006-01-26 Ibeo Automobile Sensor Gmbh Method for calibrating a distance image sensor
US7667769B2 (en) 2004-07-12 2010-02-23 Honeywell International Inc. Rotatable wireless electrical coupler
DE102004044973B4 (en) 2004-09-16 2014-12-04 Sick Ag Control of a surveillance area
US20060100783A1 (en) 2004-10-21 2006-05-11 Sick Ag Monitoring the surroundings of a vehicle
US8078338B2 (en) 2004-10-22 2011-12-13 Irobot Corporation System and method for behavior based control of an autonomous vehicle
WO2006083349A2 (en) 2004-11-19 2006-08-10 Science & Engineering Services, Inc. Enhanced portable digital lidar system
EP1672382A1 (en) 2004-12-18 2006-06-21 Leica Geosystems AG Method for single channel heterodyne distance measurement
US7688374B2 (en) 2004-12-20 2010-03-30 The United States Of America As Represented By The Secretary Of The Army Single axis CCD time gated ladar sensor
WO2006076731A1 (en) 2005-01-12 2006-07-20 University Of Florida Research Foundation, Inc. Full circumferential scanning OCT intravascular imaging probe based on scanning MEMS mirror
US20060176697A1 (en) 2005-02-08 2006-08-10 Arruda Steven S Combination light fixture and motion sensor apparatus
US20060186326A1 (en) 2005-02-21 2006-08-24 Takashi Ito Wave receiving apparatus and distance measuring apparatus
US20060197867A1 (en) 2005-03-02 2006-09-07 Peter Johnson Imaging head and imaging system
US8139685B2 (en) 2005-05-10 2012-03-20 Qualcomm Incorporated Systems, methods, and apparatus for frequency control
US8451432B2 (en) 2005-06-09 2013-05-28 Analog-Modules, Inc. Laser spot tracking with off-axis angle detection
US8203702B1 (en) 2005-06-13 2012-06-19 ARETé ASSOCIATES Optical system
US20080002176A1 (en) 2005-07-08 2008-01-03 Lockheed Martin Corporation Lookdown and loitering ladar system
US20070071056A1 (en) 2005-09-09 2007-03-29 Ye Chen Laser ranging with large-format VCSEL array
US7511800B2 (en) 2005-11-28 2009-03-31 Robert Bosch Company Limited Distance measurement device with short range optics
DE102006002376A1 (en) 2006-01-17 2007-07-19 Robert Bosch Gmbh Method and device for transit traffic detection
US7358819B2 (en) 2006-01-17 2008-04-15 Rockwell Automation Technologies, Inc. Reduced-size sensor circuit
US7489186B2 (en) 2006-01-18 2009-02-10 International Rectifier Corporation Current sense amplifier for voltage converter
US7544945B2 (en) 2006-02-06 2009-06-09 Avago Technologies General Ip (Singapore) Pte. Ltd. Vertical cavity surface emitting laser (VCSEL) array laser scanner
US20070201027A1 (en) 2006-02-07 2007-08-30 Doushkina Valentina V Innovative Raster-Mirror Optical Detection System For Bistatic Lidar
US7826117B2 (en) 2006-02-20 2010-11-02 Sanyo Electric Co., Ltd. Beam irradiation apparatus
US7944548B2 (en) 2006-03-07 2011-05-17 Leica Geosystems Ag Increasing measurement rate in time of flight measurement apparatuses
US8050863B2 (en) 2006-03-16 2011-11-01 Gray & Company, Inc. Navigation and control system for autonomous vehicles
DE202006005643U1 (en) 2006-03-31 2006-07-06 Faro Technologies Inc., Lake Mary Device for three-dimensional detection of a spatial area
US7501616B2 (en) 2006-05-25 2009-03-10 Microvision, Inc. Method and apparatus for capturing an image of a moving object
DE102006027063A1 (en) 2006-06-10 2007-12-13 Sick Ag scanner
US20080013896A1 (en) 2006-06-28 2008-01-17 Salzberg Jose B Miniature optical transceiver
EP1876415B1 (en) 2006-07-03 2010-05-12 Trimble AB A surveying instrument and method of controlling a surveying instrument
DE102006031580A1 (en) 2006-07-03 2008-01-17 Faro Technologies, Inc., Lake Mary Method and device for the three-dimensional detection of a spatial area
USRE46672E1 (en) 2006-07-13 2018-01-16 Velodyne Lidar, Inc. High definition LiDAR system
EP2041515A4 (en) 2006-07-13 2009-11-11 Velodyne Acoustics Inc High definition lidar system
EP1895318B1 (en) 2006-08-28 2009-12-16 IBEO Automobile Sensor GmbH Method for determining the global position
US7701558B2 (en) 2006-09-22 2010-04-20 Leica Geosystems Ag LIDAR system
KR100758987B1 (en) 2006-09-26 2007-09-17 Samsung Electronics Co., Ltd. An LED lighting device and a method for controlling the same
DE102006060108A1 (en) 2006-12-20 2008-06-26 Sick Ag laser scanner
US20080170826A1 (en) 2007-01-16 2008-07-17 Applied Optical Materials Misalignment-tolerant optical coupler/connector
US8953647B1 (en) 2007-03-21 2015-02-10 Lockheed Martin Corporation High-power laser using thulium-doped fiber amplifier and frequency quadrupling for blue output
US8767215B2 (en) 2007-06-18 2014-07-01 Leddartech Inc. Method for detecting objects with light
US8063415B2 (en) 2007-07-25 2011-11-22 Renesas Electronics Corporation Semiconductor device
US7944420B2 (en) 2007-09-28 2011-05-17 Osram Sylvania Inc. Light emitting diode driver providing current and power control
TWI358606B (en) 2007-12-28 2012-02-21 Ind Tech Res Inst Method for three-dimension (3d) measurement and an
JP5376707B2 (en) 2008-01-24 2013-12-25 Semiconductor Energy Laboratory Co., Ltd. Laser annealing equipment
US7642946B2 (en) 2008-04-07 2010-01-05 Broadcom Corporation Successive approximation analog to digital converter
WO2009136184A2 (en) 2008-04-18 2009-11-12 Bae Systems Plc Improvements in lidars
US8301027B2 (en) 2008-05-02 2012-10-30 Massachusetts Institute Of Technology Agile-beam laser array transmitter
US8311067B2 (en) 2008-06-12 2012-11-13 Akonia Holographics, Llc System and devices for improving external cavity diode lasers using wavelength and mode sensors and compact optical paths
US8466725B2 (en) 2008-08-13 2013-06-18 Pierre F. Thibault Method and device for generating short pulses
IL200332A0 (en) 2008-08-19 2010-04-29 Rosemount Aerospace Inc Lidar system using a pseudo-random pulse sequence
KR101781399B1 (en) 2008-11-17 2017-09-25 Express Imaging Systems, LLC Electronic control to regulate power for solid-state lighting and methods thereof
JP5688876B2 (en) 2008-12-25 2015-03-25 Topcon Corporation Calibration method for laser scanner measurement system
US20100204964A1 (en) 2009-02-09 2010-08-12 Utah State University Lidar-assisted multi-image matching for 3-d model and sensor pose refinement
US8717545B2 (en) 2009-02-20 2014-05-06 Digital Signal Corporation System and method for generating three dimensional images using lidar and video measurements
US8761465B2 (en) 2009-03-18 2014-06-24 Microsoft Corporation Centroid processing
US8447563B2 (en) 2009-03-31 2013-05-21 The United States Of America As Represented By The Secretary Of The Navy Method and system for determination of detection probability or a target object based on a range
US8077047B2 (en) 2009-04-16 2011-12-13 Ut-Battelle, Llc Tampering detection system using quantum-mechanical systems
US8542252B2 (en) * 2009-05-29 2013-09-24 Microsoft Corporation Target digitization, extraction, and tracking
US8675181B2 (en) 2009-06-02 2014-03-18 Velodyne Acoustics, Inc. Color LiDAR scanner
WO2010144961A1 (en) 2009-06-17 2010-12-23 Stephen Woodford Determining haemodynamic performance
US20110028859A1 (en) * 2009-07-31 2011-02-03 Neuropace, Inc. Methods, Systems and Devices for Monitoring a Target in a Neural System and Facilitating or Controlling a Cell Therapy
WO2011044629A1 (en) 2009-10-14 2011-04-21 Newsouth Innovations Pty Limited Location verification in quantum communications
US8760631B2 (en) 2010-01-27 2014-06-24 Intersil Americas Inc. Distance sensing by IQ domain differentiation of time of flight (TOF) measurements
DE102010010097A1 (en) 2010-03-01 2011-09-01 Esw Gmbh Compact laser rangefinder
US20110305256A1 (en) 2010-03-05 2011-12-15 TeraDiode, Inc. Wavelength beam combining based laser pumps
EP3901653A3 (en) 2010-05-17 2022-03-02 Velodyne Lidar USA, Inc. High definition lidar system
US8605262B2 (en) 2010-06-23 2013-12-10 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Time shifted PN codes for CW LiDAR, radar, and sonar
US8736818B2 (en) 2010-08-16 2014-05-27 Ball Aerospace & Technologies Corp. Electronically steered flash LIDAR
WO2012056615A1 (en) 2010-10-26 2012-05-03 Panasonic Corporation Semiconductor device
US20130286404A1 (en) 2010-11-16 2013-10-31 Thunder Bay Regional Research Institute Methods and apparatus for alignment of interferometer
EP2641102A4 (en) 2010-11-19 2014-08-13 Nokia Corp Handling complex signal parameters
JP5822255B2 (en) 2011-04-14 2015-11-24 Toyota Central R&D Labs., Inc. Object identification device and program
US8976340B2 (en) 2011-04-15 2015-03-10 Advanced Scientific Concepts, Inc. Ladar sensor for landing, docking and approach
US8908159B2 (en) 2011-05-11 2014-12-09 Leddartech Inc. Multiple-field-of-view scannerless optical rangefinder in high ambient background light
DE102011076493A1 (en) 2011-05-26 2012-11-29 Hilti Aktiengesellschaft Measuring device for distance measurement
US9059562B2 (en) 2011-06-23 2015-06-16 Daylight Solutions, Inc. Control system for directing power to a laser assembly
US9069061B1 (en) 2011-07-19 2015-06-30 Ball Aerospace & Technologies Corp. LIDAR with analog memory
US9288513B2 (en) 2011-08-29 2016-03-15 Aerovironment, Inc. System and method of high-resolution digital data image transmission
US8907921B2 (en) 2011-08-30 2014-12-09 Synaptics Incorporated Interference sensing within a display device with an integrated sensing device
US9453914B2 (en) 2011-09-08 2016-09-27 Continental Advanced Lidar Solutions Us, Inc. Terrain mapping LADAR system
WO2013053952A1 (en) 2011-10-14 2013-04-18 Iee International Electronics & Engineering S.A. Spatially selective detection using a dynamic mask in an image plane
US9217415B2 (en) 2011-10-14 2015-12-22 Vestas Wind Systems A/S Estimation of wind properties using a light detection and ranging device
EP2607924A1 (en) 2011-12-23 2013-06-26 Leica Geosystems AG Distance sensor adjustment
US8754412B2 (en) 2012-01-03 2014-06-17 International Business Machines Corporation Intra die variation monitor using through-silicon via
US9651417B2 (en) 2012-02-15 2017-05-16 Apple Inc. Scanning depth engine
US9915726B2 (en) 2012-03-16 2018-03-13 Continental Advanced Lidar Solutions Us, Llc Personal LADAR sensor
US20130241761A1 (en) 2012-03-16 2013-09-19 Nikon Corporation Beam steering for laser radar and other uses
US8804101B2 (en) 2012-03-16 2014-08-12 Advanced Scientific Concepts, Inc. Personal LADAR sensor
US8994925B2 (en) 2012-03-27 2015-03-31 Pulsedlight, Inc. Optical distance measurement device
US20160191173A1 (en) 2012-04-10 2016-06-30 Robert Anderson Malaney Location Verification in Quantum Communications
US9246041B1 (en) 2012-04-26 2016-01-26 Id Quantique Sa Apparatus and method for allowing avalanche photodiode based single-photon detectors to be driven by the same electrical circuit in gated and in free-running modes
US9349263B2 (en) 2012-06-22 2016-05-24 GM Global Technology Operations LLC Alert systems and methods for a vehicle
CN107104649B (en) 2012-08-15 2020-10-13 天工方案公司 Radio frequency power amplifier control circuit and method, radio frequency module and radio frequency device
US9081096B2 (en) 2012-08-31 2015-07-14 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Apparatus, method, and computer program for a resolution-enhanced pseudo-noise code technique
EP4221187A3 (en) 2012-09-10 2023-08-09 Aemass, Inc. Multi-dimensional data capture of an environment using plural devices
CN104620129A (en) * 2012-09-14 2015-05-13 法罗技术股份有限公司 Laser scanner with dynamical adjustment of angular scan velocity
US9383753B1 (en) 2012-09-26 2016-07-05 Google Inc. Wide-view LIDAR with areas of special attention
US9442195B2 (en) 2012-10-11 2016-09-13 Lumentum Operations Llc Power efficient pulsed laser driver for time of flight cameras
US9151940B2 (en) 2012-12-05 2015-10-06 Kla-Tencor Corporation Semiconductor inspection and metrology system using laser pulse multiplier
US9285477B1 (en) 2013-01-25 2016-03-15 Apple Inc. 3D depth point cloud from timing flight of 2D scanned light beam pulses
US9083960B2 (en) * 2013-01-30 2015-07-14 Qualcomm Incorporated Real-time 3D reconstruction with power efficient depth sensor usage
US9297756B2 (en) * 2013-02-01 2016-03-29 Battelle Memorial Institute Capillary absorption spectrometer and process for isotopic analysis of small samples
KR102048361B1 (en) 2013-02-28 2019-11-25 LG Electronics Inc. Distance detecting device and image processing apparatus including the same
US9250327B2 (en) 2013-03-05 2016-02-02 Subcarrier Systems Corporation Method and apparatus for reducing satellite position message payload by adaptive data compression techniques
US9063549B1 (en) 2013-03-06 2015-06-23 Google Inc. Light detection and ranging device with oscillating mirror driven by magnetically interactive coil
US9086273B1 (en) 2013-03-08 2015-07-21 Google Inc. Microrod compression of laser beam in combination with transmit lens
US9110169B2 (en) 2013-03-08 2015-08-18 Advanced Scientific Concepts, Inc. LADAR enabled impact mitigation system
US9319916B2 (en) 2013-03-15 2016-04-19 Isco International, Llc Method and appartus for signal interference processing
US9215430B2 (en) 2013-03-15 2015-12-15 Omnivision Technologies, Inc. Image sensor with pixels having increased optical crosstalk
US9239959B1 (en) 2013-04-08 2016-01-19 Lockheed Martin Corporation Multi-resolution, wide field-of-view, unmanned ground vehicle navigation sensor
US10132928B2 (en) 2013-05-09 2018-11-20 Quanergy Systems, Inc. Solid state optical phased array lidar and method of using same
US9069080B2 (en) 2013-05-24 2015-06-30 Advanced Scientific Concepts, Inc. Automotive auxiliary ladar sensor
US9113154B2 (en) 2013-07-10 2015-08-18 Faro Technologies, Inc. Three-dimensional measurement device having three-dimensional overview camera
US9629220B2 (en) 2013-08-05 2017-04-18 Peter Panopoulos Sensor-based controllable LED lighting system with repositionable components and method
US10126412B2 (en) 2013-08-19 2018-11-13 Quanergy Systems, Inc. Optical phased array lidar system and method of using same
US8836922B1 (en) 2013-08-20 2014-09-16 Google Inc. Devices and methods for a rotating LIDAR platform with a shared transmit/receive path
JP6440151B2 (en) 2013-08-23 2018-12-19 Sicpa Holding SA Method and system for authenticating a device
US9684066B2 (en) 2013-10-28 2017-06-20 Texas Instruments Incorporated Light radar signal processing apparatus, systems and methods
US10203399B2 (en) 2013-11-12 2019-02-12 Big Sky Financial Corporation Methods and apparatus for array based LiDAR systems with reduced interference
CA2931055C (en) 2013-11-22 2022-07-12 Ottomotto Llc Lidar scanner calibration
KR101770872B1 (en) 2013-12-27 2017-08-23 Mando Corporation TOF camera for vehicle and method for driving the same
KR20150095033A (en) 2014-02-12 2015-08-20 Electronics and Telecommunications Research Institute Laser radar apparatus and method for acquiring image thereof
US9110154B1 (en) 2014-02-19 2015-08-18 Raytheon Company Portable programmable ladar test target
US8995478B1 (en) 2014-04-08 2015-03-31 Tekhnoscan-Lab LLC Passively mode-locked pulsed fiber laser
US9360554B2 (en) 2014-04-11 2016-06-07 Facet Technology Corp. Methods and apparatus for object detection and identification in a multiple detector lidar array
US9286538B1 (en) 2014-05-01 2016-03-15 Hrl Laboratories, Llc Adaptive 3D to 2D projection for different height slices and extraction of robust morphological features for 3D object recognition
US9575184B2 (en) 2014-07-03 2017-02-21 Continental Advanced Lidar Solutions Us, Inc. LADAR sensor for a dense environment
US9759809B2 (en) 2014-07-08 2017-09-12 Sikorsky Aircraft Corporation LIDAR-based shipboard tracking and state estimation for autonomous landing
US9531928B2 (en) 2014-07-08 2016-12-27 Flir Systems, Inc. Gimbal system with imbalance compensation
US9342968B2 (en) 2014-08-12 2016-05-17 Tyco Fire & Security Gmbh Electronic article surveillance systems implementing methods for determining security tag locations
WO2016036961A1 (en) 2014-09-05 2016-03-10 Halliburton Energy Services, Inc. Electromagnetic signal booster
US9734276B2 (en) 2014-10-22 2017-08-15 Samsung Electronics Co., Ltd. Integrated circuit and method of designing layout of the same
US10036801B2 (en) 2015-03-05 2018-07-31 Big Sky Financial Corporation Methods and apparatus for increased precision and improved range in a multiple detector LiDAR array
US9625582B2 (en) * 2015-03-25 2017-04-18 Google Inc. Vehicle with multiple light detection and ranging devices (LIDARs)
US9529079B1 (en) 2015-03-26 2016-12-27 Google Inc. Multiplexed multichannel photodetector
DE102015004272B4 (en) 2015-04-07 2019-05-29 Metek Meteorologische Messtechnik Gmbh Distortion-tolerant Lidar measuring system and interfering light-tolerant Lidar measuring method
WO2016164455A1 (en) 2015-04-09 2016-10-13 Avaz Surgical, Llc Device and system for placing securing device within bone
US10436904B2 (en) * 2015-04-15 2019-10-08 The Boeing Company Systems and methods for modular LADAR scanning
JP6554310B2 (en) 2015-04-28 2019-07-31 Hamamatsu Photonics K.K. Distance measuring device
WO2016191367A1 (en) 2015-05-22 2016-12-01 Massachusetts Institute Of Technology Rapid and precise optically multiplexed imaging
US10591592B2 (en) 2015-06-15 2020-03-17 Humatics Corporation High-precision time of flight measurement systems
US9866208B2 (en) 2015-06-15 2018-01-09 Microsoft Technology Licensing, LLC Precision measurements and calibrations for timing generators
KR20240010086A (en) 2015-09-09 2024-01-23 Electro Scientific Industries, Inc. Laser processing apparatus, methods of laser-processing workpieces and related arrangements
US10215846B2 (en) 2015-11-20 2019-02-26 Texas Instruments Incorporated Compact chip scale LIDAR solution
US10539661B2 (en) 2015-11-25 2020-01-21 Velodyne Lidar, Inc. Three dimensional LIDAR system with targeted field of view
EP3411660A4 (en) 2015-11-30 2019-11-27 Luminar Technologies, Inc. Lidar system with distributed laser and multiple sensor heads and pulsed laser for lidar system
US10127685B2 (en) * 2015-12-16 2018-11-13 Objectvideo Labs, Llc Profile matching of buildings and urban structures
KR20200110823A (en) 2016-01-29 2020-09-25 Meiji University Laser scanning system, laser scanning method, mobile laser scanning system, and program
US9904867B2 (en) * 2016-01-29 2018-02-27 Pointivo, Inc. Systems and methods for extracting information about objects from scene information
US10627490B2 (en) 2016-01-31 2020-04-21 Velodyne Lidar, Inc. Multiple pulse, LIDAR based 3-D imaging
US10641872B2 (en) 2016-02-18 2020-05-05 Aeye, Inc. Ladar receiver with advanced optics
JP7149256B2 (en) 2016-03-19 2022-10-06 Velodyne Lidar USA, Inc. Integrated illumination and detection for LIDAR-based 3D imaging
JP7258554B2 (en) 2016-03-21 2023-04-17 Velodyne Lidar USA, Inc. Three-dimensional imaging based on LIDAR with variable irradiation field density
CN108885263B (en) 2016-03-21 2024-02-23 威力登激光雷达有限公司 LIDAR-based 3D imaging with variable pulse repetition
US10393877B2 (en) 2016-06-01 2019-08-27 Velodyne Lidar, Inc. Multiple pixel scanning LIDAR
US11270904B2 (en) 2016-07-12 2022-03-08 Brooks Automation Us, Llc Substrate processing apparatus
US20180059219A1 (en) 2016-08-31 2018-03-01 Qualcomm Incorporated Multi-beam position sensing devices
KR102698290B1 (en) 2016-09-09 2024-08-23 Samsung Electronics Co., Ltd. Phase modulation active device, method of driving the same and optical apparatus including the phase modulation active device
US10109183B1 (en) 2016-12-30 2018-10-23 Panosense Inc. Interface for transferring data between a non-rotating body and a rotating body
EP3583384A4 (en) 2017-03-20 2021-01-13 Velodyne Lidar, Inc. Lidar based 3-d imaging with structured light and integrated illumination and detection
US9869754B1 (en) 2017-03-22 2018-01-16 Luminar Technologies, Inc. Scan patterns for lidar systems
US10574967B2 (en) * 2017-03-23 2020-02-25 The Boeing Company Autonomous performance of an operation on an object using a generated dense 3D model of the object
US9989629B1 (en) 2017-03-30 2018-06-05 Luminar Technologies, Inc. Cross-talk mitigation using wavelength switching
US20180284246A1 (en) 2017-03-31 2018-10-04 Luminar Technologies, Inc. Using Acoustic Signals to Modify Operation of a Lidar System
JP7290571B2 (en) 2017-03-31 2023-06-13 Velodyne Lidar USA, Inc. Integrated LIDAR lighting output control
US10460180B2 (en) 2017-04-20 2019-10-29 GM Global Technology Operations LLC Systems and methods for visual classification with region proposals
JP2020519881A (en) 2017-05-08 2020-07-02 Velodyne Lidar, Inc. LIDAR data collection and control
CN110869750A (en) 2017-06-14 2020-03-06 优比库德股份有限公司 Optical fiber coupling broadband light source
US10670725B2 (en) * 2017-07-25 2020-06-02 Waymo Llc Determining yaw error from map data, lasers, and cameras
US10003168B1 (en) 2017-10-18 2018-06-19 Luminar Technologies, Inc. Fiber laser with free-space components
US11294041B2 (en) 2017-12-08 2022-04-05 Velodyne Lidar Usa, Inc. Systems and methods for improving detection of a return signal in a light ranging and detection system
US11971507B2 (en) 2018-08-24 2024-04-30 Velodyne Lidar Usa, Inc. Systems and methods for mitigating optical crosstalk in a light ranging and detection system
US11082010B2 (en) 2018-11-06 2021-08-03 Velodyne Lidar Usa, Inc. Systems and methods for TIA base current detection and compensation
US10613203B1 (en) 2019-07-01 2020-04-07 Velodyne Lidar, Inc. Interference mitigation for light detection and ranging

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5563706A (en) * 1993-08-24 1996-10-08 Nikon Corporation Interferometric surface profiler with an alignment optical member
US6201236B1 (en) * 1997-11-13 2001-03-13 Auto Sense Ltd. Detection system with improved noise tolerance
US7131586B2 (en) * 2000-06-07 2006-11-07 Metrologic Instruments, Inc. Method of and apparatus for reducing speckle-pattern noise in a planar laser illumination and imaging (PLIIM) based system
US20150144806A1 (en) * 2012-05-29 2015-05-28 Macquarie University Two-directional scanning for luminescence microscopy
US20150109290A1 (en) 2013-10-22 2015-04-23 Fu Tai Hua Industry (Shenzhen) Co., Ltd. Device and method for removing noise points in point clouds
US20160252617A1 (en) 2015-02-27 2016-09-01 Denso Corporation Object recognition apparatus and noise removal method
US20160327646A1 (en) 2015-05-07 2016-11-10 GM Global Technology Operations LLC Pseudo random sequences in array lidar systems
US20180058197A1 (en) * 2015-12-28 2018-03-01 Halliburton Energy Services, Inc. Distributed optical sensing using compressive sampling
WO2018129408A1 (en) 2017-01-05 2018-07-12 Innovusion Ireland Limited Method and system for encoding and decoding lidar
US20190056497A1 (en) 2017-03-01 2019-02-21 Ouster, Inc. Accurate photo detector measurements for lidar

Non-Patent Citations (1)

Title
See also references of EP3973316A4

Cited By (3)

Publication number Priority date Publication date Assignee Title
JP2022550200A (en) * 2019-10-02 2022-11-30 Cepton Technologies, Inc. Techniques for detecting cross-talk interference in lidar imaging sensors
JP7474457B2 2019-10-02 2024-04-25 Cepton Technologies, Inc. Techniques for detecting cross-talk interference in lidar imaging sensors
US12007480B2 (en) 2019-10-02 2024-06-11 Cepton Technologies, Inc. Techniques for detecting cross-talk interferences in LiDAR imaging sensors with multiple light sources

Also Published As

Publication number Publication date
KR20220025872A (en) 2022-03-03
JP2022540400A (en) 2022-09-15
EP3973316A4 (en) 2023-10-25
IL289131A (en) 2022-02-01
CA3144656A1 (en) 2021-01-07
MX2021016061A (en) 2022-04-18
CN114270215A (en) 2022-04-01
US10613203B1 (en) 2020-04-07
US20210003681A1 (en) 2021-01-07
IL289131B2 (en) 2023-08-01
IL289131B1 (en) 2023-04-01
EP3973316A1 (en) 2022-03-30
JP7546647B2 (en) 2024-09-06
US11906670B2 (en) 2024-02-20

Similar Documents

Publication Publication Date Title
US10613203B1 (en) Interference mitigation for light detection and ranging
KR102664105B1 (en) Systems and methods for improving detection of return signals in optical ranging and detection systems
CN109059902A (en) Relative pose determination method, apparatus, device and medium
US10509111B2 (en) LIDAR sensor device
JP5712900B2 (en) Peripheral object detection device
US20180121750A1 (en) Method to provide a vehicle environment contour polyline from detection data
CN109901141B (en) Calibration method and device
CN108780149B (en) Method for improving the detection of at least one object in the surroundings of a motor vehicle by indirect measurement of a sensor, control unit, driver assistance system and motor vehicle
AU2018373751A1 (en) Method and device for ascertaining an installation angle between a roadway on which a vehicle travels and a detection direction of a measurement or radar sensor
CN110986816B (en) Depth measurement system and measurement method thereof
CN113167893A (en) Improved echo signal detection in optical ranging and detection systems with pulse coding
US11513197B2 (en) Multiple-pulses-in-air laser scanning system with ambiguity resolution based on range probing and 3D point analysis
CN113466836A (en) Distance measurement method and device and laser radar
CN110954912B (en) Method and apparatus for optical distance measurement
WO2023019573A1 (en) Ranging method, waveform detection method, apparatus, and related device
JP6260418B2 (en) Distance measuring device, distance measuring method, and distance measuring program
EP3862787A1 (en) De-jitter of point cloud data for target recognition
KR102420585B1 (en) Apparatus and method for determining point cloud information in consideration of the operating environment of a light detection and ranging system
KR101896477B1 (en) Method and Apparatus for Scanning LiDAR
KR102287935B1 (en) Laser tracking system and method for extracting angular data of space objects
JP5767150B2 (en) Target size measuring device
WO2023023951A1 (en) Method for detecting anomalies of lidar point cloud data and related device
JP5767149B2 (en) Target size measuring device
US20230146935A1 (en) Content capture of an environment of a vehicle using a priori confidence levels
CN118731910A (en) Radar control method and device, terminal equipment and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20834220

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 3144656

Country of ref document: CA

ENP Entry into the national phase

Ref document number: 2022500148

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020834220

Country of ref document: EP

Effective date: 20211221

ENP Entry into the national phase

Ref document number: 20227003050

Country of ref document: KR

Kind code of ref document: A