WO2019112514A1 - Rain filtering techniques for autonomous vehicle - Google Patents

Rain filtering techniques for autonomous vehicle

Info

Publication number
WO2019112514A1
WO2019112514A1 (PCT/SG2017/050607)
Authority
WO
WIPO (PCT)
Prior art keywords
lidar
lidar device
laser light
reflected
echoes
Prior art date
Application number
PCT/SG2017/050607
Other languages
French (fr)
Inventor
Yong Way CHEE
Original Assignee
St Engineering Land Systems Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by St Engineering Land Systems Ltd. filed Critical St Engineering Land Systems Ltd.
Priority to JP2020528321A priority Critical patent/JP2021514457A/en
Priority to PCT/SG2017/050607 priority patent/WO2019112514A1/en
Priority to AU2017442202A priority patent/AU2017442202A1/en
Priority to SG11202005246TA priority patent/SG11202005246TA/en
Priority to TW107144162A priority patent/TW201932868A/en
Publication of WO2019112514A1 publication Critical patent/WO2019112514A1/en
Priority to IL275162A priority patent/IL275162A/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87Combinations of systems using electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4814Constructional features, e.g. arrangements of optical elements of transmitters alone
    • G01S7/4815Constructional features, e.g. arrangements of optical elements of transmitters alone using multiple transmitters
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4817Constructional features, e.g. arrangements of optical elements relating to scanning
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483Details of pulse systems

Definitions

  • the present invention relates to a system and method of using lidar for autonomous vehicle navigation when particulate matter is present in the air such as rain or snow.
  • An autonomous car is one that is capable of monitoring its environment and navigating without human input. Benefits of autonomous cars include greater vehicle safety, greater efficiency, lower costs, decreased congestion and greater mobility for those who are unfit or not licensed to drive.
  • CMOS cameras can be used to detect objects and hazards in their surroundings.
  • each type of sensor has limitations.
  • CMOS cameras rely on adequate lighting to detect objects.
  • the cameras can be ineffective when light is projected back by, for example, headlights of an approaching vehicle.
  • Other shortcomings of cameras include inadequate lighting, motion blurring, limited field-of-view and/or detection range.
  • Radar does not rely on visible light but lacks the resolution needed to identify smaller objects.
  • Laser light can be used to overcome these limitations but can be ineffective during rain or snow.
  • Lidar (Light Imaging Detection and Ranging) measures the distance to an object by illuminating it with a pulsed laser light and measuring the reflected pulses with a sensor.
  • a lidar system reflects multiple laser pulses off of objects surrounding the vehicle. Differences in laser return times and wavelengths can then be used to make digital 3D representations of objects.
  • It is difficult for lidar to transmit through snow, rain, fog and dust in the air. Rain or other particles in the air can reflect laser light back to the system in the same manner as a solid object. As pulsed laser light is reflected back to a lidar system, an autonomous vehicle will unnecessarily slow or stop. This is due to the “false positive” detection of obstacles.
  • lidar sensors with “multi-echo” or “last echo” technology can filter reflections caused by dust, rain and fog. They operate under the premise that a portion of the energy from a pulse may be reflected by particulate matter. The remainder of the beam can continue to propagate and is reflected by a solid object. When this occurs, the lidar unit can ignore the closer, weaker reflections caused by rain or particulates in the air. While an improvement, this technique is not robust for filtering moderate to heavy rain.
  • U.S. Patent No. 9,097,800 describes a method of combining radar with lidar to detect objects.
  • the lidar is used to create a three-dimensional point map in an area surrounding the vehicle. Radar is then used to confirm that objects or hazards are solid materials rather than rain or airborne particles. While the system is designed to prevent false positive lidar detections, it requires the use of a separate radar system. Further, radar has inherently low resolution and can miss objects with weak reflectivity.
  • U.S. Patent Application No. 14/576,265 also describes a method of using lidar to confirm the presence of objects near an autonomous vehicle.
  • the lidar detects multiple points of an object (i.e. right, left, top and bottom) to confirm its presence.
  • the approach is similar to multi-echo methods as it requires multiple pulses.
  • As with U.S. Patent No. 9,097,800, it may not be effective in detecting small objects.
  • the invention recognizes that there is a need for an improved system for autonomous vehicles to navigate and detect objects and hazards.
  • the system should be reliable and capable of operating in all weather conditions.
  • the invention includes a system and rain filtering algorithm to allow autonomous vehicles to operate in rain or snow.
  • the invention includes a system for detecting objects and/or navigating using laser light (i.e. lidar) as well as distinguishing solid objects from particles in the air.
  • the system can include a first lidar device that scans an area by emitting pulses of laser light and detecting laser light that is reflected and a second lidar device that scans the same or substantially the same area by emitting pulses of laser light and detecting laser light that is reflected.
  • a computer can use an algorithm to compare emitted laser light and reflected laser light detected by the first lidar device with emitted laser light and reflected laser light detected by the second lidar device to confirm the presence of a solid object.
  • the presence of a solid object can be confirmed when the first lidar device and the second lidar device detect reflected laser light from the solid object.
  • the first and second lidar devices can be mounted on an autonomous vehicle so that they are 0.5 to 2.5 meters apart from each other on a horizontal plane.
  • the first lidar device and the second lidar device can utilize multi-echo technology to distinguish solid objects from airborne particles.
  • the lidar devices can perform multiple plane scanning to distinguish solid objects from airborne particles.
  • the invention also includes a method of detecting objects or obstacles for autonomous driving.
  • a first lidar device scans an area near an autonomous vehicle by emitting laser light and detecting reflected laser light (i.e. echoes).
  • a second lidar device scans an area near an autonomous vehicle by emitting laser light and detecting echoes. Emitted pulses and echoes detected by the first lidar device are compared with emitted pulses and echoes detected by the second lidar device using a computer and an algorithm to detect and confirm the presence of an object or obstacle. The presence of a solid object can be confirmed when both the first lidar unit and the second lidar unit detect echoes from the object.
  • An echo that results from rain or snow can be identified by comparing echoes received by the first lidar device with echoes received by the second lidar device.
  • the first lidar device and the second lidar device can use multi-echo scanning. Further, the lidar devices can scan a series of layers or planes. The presence of a solid object can be confirmed when the solid object appears in one or more layers or planes.
  • the invention also includes a method of detecting solid objects comprised of the steps of (a) scanning an area near an autonomous vehicle by emitting pulses of laser light and detecting echoes with a first lidar unit, (b) scanning substantially the same area by emitting pulses of laser light and detecting echoes with a second lidar unit and (c) using an algorithm to compare pulses emitted and echoes received by the first lidar unit with pulses emitted and echoes received by the second lidar unit to detect and confirm the presence of an object or obstacle.
  • A false positive (i.e. particulate matter in the air such as rain or snow) can be identified in this manner.
  • the first lidar unit and the second lidar unit can use multi-echo scanning. They can also scan a series of planes. The presence of a solid object can be confirmed when the object appears in more than one plane. Furthermore, the presence of a solid object can be confirmed in the last echo of a multi-echo scan.
  • the invention also includes a method for detecting and/or confirming the presence of a solid object using lidar comprised of the steps of (a) using a first lidar device to scan multiple planes, wherein pulses of laser light are emitted and reflected laser light is detected, (b) collecting data related to laser light emitted from the first lidar device and reflected laser light detected by the first lidar device, (c) determining whether reflected light from the first lidar device is the result of particulate matter in the air by comparing multiple planes for the presence and/or absence of reflected light, (d) using a second lidar device to scan multiple planes, wherein pulses of laser light are emitted and reflected laser light is detected, (e) collecting data related to laser light emitted from the second lidar device and reflected laser light detected by the second lidar device, (f) determining whether reflected light from the second lidar device is the result of particulate matter in the air by comparing multiple planes for the presence and/or absence of reflected laser light
  • the multiple planes scanned by the first lidar device are in the same area or substantially the same area as the multiple planes scanned by the second lidar device.
  • the presence of a solid object can be confirmed when reflected laser light is detected in four or more of the multiple planes of the first lidar device and/or four or more of the multiple planes of the second lidar device.
  • the first lidar device and the second lidar device can use multi-echo scanning.
  • a first aspect of the invention is an improved system and method for detecting obstacles for autonomous driving using lidar.
  • a second aspect of the invention is an improved system and method for autonomous driving that utilizes multiple lidar systems.
  • a third aspect of the invention is an improved system and method for autonomous driving that utilizes multiple lidar systems and an algorithm to avoid falsely identifying water or particles in the air as solid objects.
  • a fourth aspect of the invention is an improved system and method for autonomous driving that utilizes multi-echo technology, scans multiple layers and employs an algorithm to mitigate interference caused from rain or particles in the air.
  • a fifth aspect of the invention is an improved system and method for autonomous driving that utilizes dual lidar systems, multi-echo technology and multiple layer scanning with an algorithm to improve the robustness and mitigate interference caused by rain or particles in the air.
  • FIG. 1A depicts a conventional multi-echo lidar system.
  • FIG. 1B depicts a multi-layer scan lidar wherein each scan layer is superimposed on a grid map.
  • FIG. 2 depicts a grid obtained by a multi-layer scan lidar. A solid object will be present on multiple layers.
  • FIG. 3A is a flowchart of an embodiment wherein a lidar sensor uses last echo filtering and a rain filter algorithm for obstacle feature detection.
  • FIG. 3B depicts dual lidar units used to detect obstacles according to an embodiment of the invention.
  • FIG. 3C is a flowchart of an embodiment of the invention wherein dual lidar sensors use last echo filtering and a rain filter algorithm to distinguish return signals caused by rain or particles in the air from those of solid objects.
  • FIG. 4A depicts an autonomous vehicle with dual lidar sensors separated by a distance “d.”
  • FIG. 4B depicts dual lidar sensors and their respective zones of detection.
  • FIG. 5A is a plot of a single plane lidar.
  • FIG. 5B is a plot of a single plane lidar using the dual lidar system.
  • FIG. 5C is a plot of a multi-plane lidar.
  • FIG. 5D is a plot of a multi-plane lidar using the dual lidar system.
  • While the invention is primarily described for the navigation of autonomous vehicles, it is understood that the invention is not so limited and can be used to assist with other endeavors that use lidar. Other applications include, for example, using the invention in other vehicles such as robots, drones or unmanned aircraft systems. The invention can also be used to improve the robustness of lidar in landscape imaging and mapping applications when particles such as rain or snow are present in the air.
  • references in this specification to “one embodiment/aspect” or “an embodiment/aspect” mean that a particular feature, structure, or characteristic described in connection with the embodiment/aspect is included in at least one embodiment/aspect of the disclosure.
  • the use of the phrase “in one embodiment/aspect” or “in another embodiment/aspect” in various places in the specification does not necessarily refer to the same embodiment/aspect, nor are separate or alternative embodiments/aspects mutually exclusive of other embodiments/aspects.
  • various features are described which may be exhibited by some embodiments/aspects and not by others.
  • various requirements are described which may be requirements for some embodiments/aspects but not other embodiments/aspects.
  • Embodiment and aspect can, in certain instances, be used interchangeably.
  • multi-echo capability refers to the ability of a lidar unit to gather and evaluate multiple (e.g. three) echoes per transmitted laser pulse. Once an echo reaches the receiver of the unit, the received intensity is transformed into a voltage. An echo from a solid object will usually yield a high voltage over a long period of time. An echo of a rain drop, however, yields a very low voltage over a short period of time.
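The voltage-and-duration heuristic described above can be sketched as a simple per-echo classifier. This is an illustrative sketch only: the `Echo` type, field names, and both threshold values are hypothetical and would depend on the particular lidar hardware.

```python
from dataclasses import dataclass

@dataclass
class Echo:
    voltage: float       # peak receiver voltage for this echo
    duration_ns: float   # time the return stays above the noise floor

# Hypothetical thresholds; real values depend on the lidar hardware.
MIN_SOLID_VOLTAGE = 0.5
MIN_SOLID_DURATION_NS = 10.0

def is_solid_echo(echo: Echo) -> bool:
    """An echo from a solid object usually yields a high voltage over a
    long period of time; an echo from a rain drop yields a very low
    voltage over a short period of time."""
    return (echo.voltage >= MIN_SOLID_VOLTAGE
            and echo.duration_ns >= MIN_SOLID_DURATION_NS)
```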
  • multi-layer technology refers to a lidar unit that allows for a pitch angle compensation by means of multiple (e.g. four) scan planes with different vertical angles.
  • the photodiode receiver of the unit includes multiple (e.g. four) independent receivers arranged in a line. Each receiver scans a single plane, thus dividing the vertical aperture into multiple planes.
  • SLAM refers to Simultaneous Localization And Mapping which enables accurate mapping where GPS localization is unavailable, such as indoor spaces.
  • SLAM algorithms use LiDAR and IMU (Inertial Measurement Unit) data to simultaneously locate the sensor and generate a coherent map of its surroundings.
  • Time-of-Flight Principle refers to a method for measuring the distance between a sensor and an object, based on the time difference between the emission of a signal and its return to the sensor, after being reflected by an object.
  • Lidar is fundamentally a distance measuring technology.
  • a lidar system sends light energy to the ground or toward an object. This emitted light can be referred to as a “beam” or “pulse.”
  • the lidar unit measures light that is reflected back to the sensor. This reflected light can be referred to as the “echo” or “return.”
  • the spatial distance between the lidar system and a measured point is calculated by comparing the delay between the pulse and return.
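The delay-to-distance calculation follows the time-of-flight principle: the pulse travels to the target and back, so the one-way distance is half the round-trip delay multiplied by the speed of light. A minimal sketch (the function name is illustrative):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(delay_s: float) -> float:
    """Distance from the round-trip delay between pulse and return:
    the light travels to the target and back, hence the factor of 2."""
    return SPEED_OF_LIGHT_M_S * delay_s / 2.0
```

For example, a delay of one microsecond corresponds to a target roughly 150 meters away.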
  • Lidar is often a preferred system for use in autonomous vehicles because it can accurately map the three-dimensional surroundings of a vehicle to a high resolution.
  • rain or particulate matter in the air can render lidar ineffective as the particles reflect laser light back to the lidar unit.
  • an autonomous vehicle will slow or stop as the airborne particulates are indistinguishable from solid objects.
  • FIG. 1A depicts a lidar unit 110 using last echo filtering to detect an object 120.
  • the technique is also referred to as multi-echo scanning.
  • Rain 115 is depicted between the lidar unit 110 and an object 120.
  • One pulse can generate multiple echoes (i.e. first echo, intermediate echoes, last echo) of laser light returning to the lidar unit. This can occur when particles in the air such as rain reflect laser light back to the lidar unit 110.
  • a first echo 130 is depicted wherein the signal is reflected from particles in the air. Intermediate echoes can also occur as a result of particles in the air.
  • a last echo 140 is depicted wherein the signal (i.e. echo) is reflected from a solid object 120.
  • the system can analyze echoes to identify those that are a result of rain or particles in the air. Using a “last echo filter,” the first and intermediate echoes are attributed to rain or particles in the air. The last echo is attributed to a solid object.
  • this technique has limitations and is ineffective in heavy rain or with substantial particulate matter in the air.
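Last echo filtering as described above can be sketched as follows; representing a pulse's echoes as a nearest-first list of ranges is an assumption made for illustration:

```python
def last_echo_filter(echo_ranges_m):
    """Return the last (farthest) echo of a pulse, in metres.
    Earlier, closer echoes are attributed to rain or particles in the
    air. `echo_ranges_m` lists one pulse's echoes, nearest first."""
    if not echo_ranges_m:
        return None   # no return at all for this pulse
    return echo_ranges_m[-1]
```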
  • FIG. 1B depicts multi-layer scanning, another technique aimed at identifying echoes or returns from rain 115 or particulate matter in the air.
  • a lidar unit 110 scans multiple angles.
  • a layer 130, plane or stacked plane can be attributed to the scan of each angle.
  • the system can compare data from the layers to confirm the presence or absence of an object. An object that appears on a single layer will likely be attributed to a particle in the air. In a preferred method, the presence of a solid object is confirmed when echoes are detected in four or more adjacent layers.
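The adjacent-layer confirmation rule above can be sketched as a run-length check over per-layer hits for the cell under test. The `layer_hits` representation and function name are illustrative assumptions; the four-layer threshold follows the preferred method described above.

```python
def confirmed_by_layers(layer_hits, min_adjacent=4):
    """layer_hits[i] is True if scan layer i returned an echo at the
    cell under test. Confirm a solid object only when at least
    `min_adjacent` consecutive (adjacent) layers report an echo."""
    run = 0
    for hit in layer_hits:
        run = run + 1 if hit else 0
        if run >= min_adjacent:
            return True
    return False
```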
  • Each scan angle/layer can be superimposed on a grid map as depicted in FIG. 2.
  • the results of a scan are represented on four layers. Of four scanned angles, each angle is depicted on a horizontal grid. Each shaded square indicates that the lidar unit detected an echo.
  • a conventional lidar system will indicate the presence of two objects, 215 and 225. Analysis of data from multiple layers can distinguish a false positive reading from a true reading.
  • the obstacles can be identified as a particle in the air such as rain.
  • While the first object 215 is present in multiple layers, it is absent in an intermediate layer (Scan Layer 3). Further, its location varies among layers. In this circumstance, the system can conclude that the echoes are attributable to rain or particles in the air. In contrast, the second object 225 is present in multiple consecutive layers at the same location. With this data, the system can confirm that the signals are attributable to a solid object. In a preferred method, an echo in four cells is used to confirm the presence of an object or obstacle. If the object is in three or fewer cells, it is attributed to rain or airborne particulates.
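The grid-map comparison can be sketched as counting, for a given grid cell, how many scan layers report an echo at that location. The occupancy-grid representation and function name are illustrative assumptions; the four-layer threshold follows the preferred method described above.

```python
def classify_cell(grid_layers, row, col, min_layers=4):
    """grid_layers holds one 2-D occupancy grid per scan layer.
    Confirm a solid object when the same cell is occupied in at least
    `min_layers` layers; fewer hits are attributed to rain."""
    hits = sum(1 for layer in grid_layers if layer[row][col])
    return "solid" if hits >= min_layers else "rain"

# Four scan layers over a 2x2 grid: cell (0, 0) echoes in every layer
# (like object 225), cell (1, 1) echoes in only two layers (rain).
layers = [
    [[1, 0], [0, 1]],
    [[1, 0], [0, 1]],
    [[1, 0], [0, 0]],
    [[1, 0], [0, 0]],
]
```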
  • FIG. 3A is a flowchart that depicts a lidar unit that employs both last echo filtering and a rain filter algorithm.
  • the lidar unit can use a last echo filter 305 and a rain filter algorithm 315 that analyzes data from multi-layer scanning for obstacle feature and detection 325. This improves the reliability and robustness of lidar when airborne particles such as rain are present.
  • FIG. 3B depicts an embodiment of the invention wherein two lidar units function concurrently to improve the robustness of an autonomous vehicle navigation system and allow it to operate during rainy or snowy conditions.
  • the system can include a first (i.e. left) lidar unit 210 and a second (i.e. right) lidar unit 220 that scan substantially the same area. Data can be collected from each lidar unit and compared using an algorithm to confirm that a signal is the result of a solid object.
  • the system confirms the presence of an object using the second lidar unit.
  • the system can use an algorithm to analyze data from each lidar unit which includes emitted light, reflected light (i.e. echoes) as well as the angles of reflection and movement of objects.
  • the second lidar unit confirms that reflected light received by the first lidar unit is the result of a solid object.
  • a reflected pulse (i.e. echo) that is detected by one of the units, without confirmation from the other, can be attributed to a particle in the air and classified as a false positive.
  • An echo that is detected by the left lidar unit 210 that is not detected by right lidar unit 220 can be attributed to a particle in the air such as rain.
  • An echo that is detected by the right lidar unit 220 that is not detected by left lidar unit 210 can be attributed to a particle in the air such as rain.
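The dual-lidar cross-check described above can be sketched as keeping only scan points that both units report at approximately the same position. The 2-D point representation, the assumption of a common vehicle frame, and the matching tolerance are all illustrative; a real system would also account for the mounting offset and scan timing.

```python
import math

def cross_confirm(left_points, right_points, tol_m=0.2):
    """Keep only (x, y) scan points reported by both lidar units to
    within tol_m metres; unmatched points are attributed to rain.
    Points are assumed already transformed into a common vehicle
    frame, so a solid object appears near the same (x, y) for both."""
    confirmed = []
    for lx, ly in left_points:
        if any(math.hypot(lx - rx, ly - ry) <= tol_m
               for rx, ry in right_points):
            confirmed.append((lx, ly))
    return confirmed
```

A point seen only by one unit, such as a rain drop close to that unit's emitter, is dropped; a point both units agree on survives the filter.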
  • FIG. 3B depicts a first echo 130 along with a last echo 140.
  • FIG. 3C is a flowchart that depicts a system of two lidar units according to an embodiment of the invention.
  • the first lidar unit 210 and the second lidar unit 220 can use last echo filtering (310, 320). Data from each unit is collected, processed and analyzed using a rain filter algorithm 330.
  • the algorithm can, for example, attribute an echo to rain if it is detected by one lidar unit rather than the pair. This leads to more robust obstacle and feature detection 340.
  • An autonomous vehicle can avoid objects/hazards without slowing and/or stopping due to false positive signals from rain or particles in the air.
  • FIG. 4A depicts a preferred arrangement of the system wherein a first lidar unit 210 and a second lidar unit 220 are mounted on the roof of an automobile 410 and separated by a distance “d” from one another.
  • the distance “d” can be determined empirically and is preferably between 0.5 meters and 2.5 meters.
  • the lidar units can be separated by a distance according to variables such as the vehicle size, resolution of the lidar units, anticipated distance of objects/hazards and the amount of particulate matter in the air.
  • the ideal distance can vary from, for example, 1 cm for a small mobile robot to several meters for a large drone.
  • the first lidar unit 210 and the second lidar unit 220 can be mounted at other external positions on the vehicle (e.g. the bumper or hood).
  • the lidar units are positioned on a horizontal plane with one another so that laser beams are emitted on the same planes.
  • FIG. 4B depicts the zones of detection 400 of a first (i.e. left) lidar unit 210 and a second (i.e. right) lidar unit 220.
  • Each system scans a substantially circular area to detect an object 120. The scans overlap in the center area 235.
  • the number and direction of the rays emitted by each sensor can vary based on the lidar unit and the setting.
  • Rain or particles in the air can reflect laser light to the lidar units.
  • a solid object 120 is detected from laser light reflected to both lidar units.
  • the two lidar units function concurrently and echoes are analyzed. Multi-unit technology can be combined with last echo filtering techniques and multi-layer scanning to improve the robustness of autonomous vehicle navigation and allow operation during conditions where particles such as rain or snow are present in the air.
  • Lidar units were affixed to the roof of a vehicle at a distance of approximately one meter from each other.
  • Two types of lidar were used: a single plane (LMS151 unit) and a four-plane (LDMRS unit).
  • the lidar units used multi-echo technology.
  • FIG. 5A - 5D are top view representations of the environment with an autonomous vehicle at the center.
  • the horizontal axis represents the distance away from the front and rear of the vehicle in meters.
  • the vertical axis represents the distance away from the sides of the vehicle in meters. Each segment depicts a distance of 5 meters.
  • FIG. 5A is a plot of a single plane lidar using multi-echo scanning.
  • the circled areas are known locations of solid objects (pillars in the rain tunnel).
  • the solid objects are detected along with additional areas of false positives from rain drops 505. Most of the false positives are detected in the area in front of the vehicle 505.
  • FIG. 5B is a plot of a single plane lidar with data from the dual lidar system. Only scan points detected by both the first lidar system and the second lidar system are included.
  • the horizontal axis represents the distance away from the front and rear of the vehicle.
  • the vertical axis represents the distance away from the sides of the vehicle.
  • Return signals attributable to rain can be identified by comparing FIG. 5A and FIG. 5B. Without the dual lidar system, false positives are present and attributed to echoes from rain drops in the air 505. These false positives are absent with the use of the dual lidar system.
  • the circled areas are structural objects (e.g. pillars) that are present and detected in both tests.
  • FIG. 5C and FIG. 5D demonstrate the use of a multi-plane lidar filter in detecting false positive echoes.
  • FIG. 5C is a plot of a multi-plane lidar (unfiltered). The horizontal axis represents the distance away from the front and rear of the vehicle. The vertical axis represents the distance away from the sides of the vehicle. Each segment depicts a distance of 5 meters. Points along the horizontal axis indicate the detection of false positives due to rain.
  • FIG. 5D is a plot of a multi-plane lidar that uses a dual lidar system. Many of the plotted points are absent with the use of a dual lidar filter. These points are attributed to rain drops in the air. The points that are present in both FIG. 5C and FIG. 5D can be attributed to physical objects (i.e. features).
  • Additional modifications/components can improve the robustness of the system in heavy rain (e.g. rain of more than 20 mm/hour).
  • One approach is to use lidar units capable of analyzing additional layers (e.g. five or more layers) for better penetration in rain.
  • Lidar units with higher refresh rates can also be used to filter out rain droplets.
  • a mechanical air blower can be implemented to provide an air curtain with a range (e.g. up to one meter) to reduce the amount of rain drops being detected by the lidar units.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The invention relates to a system and method for using lidar to navigate an autonomous vehicle during rainy or snowy conditions. Two lidar units are used to concurrently scan an area near an autonomous vehicle. The reflected laser light from the two lidar devices is analyzed with an algorithm. Reflected laser light that is not detected by both units can be attributed to rain. The lidar units can use multi-echo scanning. Further, the lidar devices can scan a series of planes (i.e. multi-plane scanning) to improve the robustness of the system. The system improves the reliability of lidar navigation when rain, snow or other particles are present in the air.

Description

Rain Filtering Techniques for Autonomous Vehicle
TECHNICAL FIELD
[0001] The present invention relates to a system and method of using lidar for autonomous vehicle navigation when particulate matter is present in the air such as rain or snow.
BACKGROUND
[0002] An autonomous car is one that is capable of monitoring its environment and navigating without human input. Benefits of autonomous cars include greater vehicle safety, greater efficiency, lower costs, decreased congestion and greater mobility for those who are unfit or not licensed to drive.
[0003] Autonomous cars use sensors to monitor their location and their environment. CMOS cameras, radar and laser light can be used to detect objects and hazards in their surroundings. However, each type of sensor has limitations. For example, CMOS cameras rely on adequate lighting to detect objects. Moreover, the cameras can be ineffective when light is projected back by, for example, headlights of an approaching vehicle. Other shortcomings of cameras include inadequate lighting, motion blurring, limited field-of-view and/or detection range. Radar does not rely on visible light but lacks the resolution needed to identify smaller objects. Laser light can be used to overcome these limitations but can be ineffective during rain or snow.
[0004] Lidar (Light Imaging Detection and Ranging) measures the distance to an object by illuminating it with a pulsed laser light and measuring the reflected pulses with a sensor. In typical use, a lidar system reflects multiple laser pulses off of objects surrounding the vehicle. Differences in laser return times and wavelengths can then be used to make digital 3D representations of objects.
[0005] However, it is difficult for lidar to transmit through snow, rain, fog and dust in the air. Rain or other particles in the air can reflect laser light back to the system in the same manner as a solid object. As pulsed laser light is reflected back to a lidar system, an autonomous vehicle will unnecessarily slow or stop. This is due to the “false positive” detection of obstacles.
[0006] Methods have been proposed to improve the robustness of lidar during rainy or snowy conditions. For example, lidar sensors with “multi-echo” or “last echo” technology can filter reflections caused by dust, rain and fog. They operate under the premise that a portion of the energy from a pulse may be reflected by particulate matter. The remainder of the beam can continue to propagate and is reflected by a solid object. When this occurs, the lidar unit can ignore the closer, weaker reflections caused by rain or particulates in the air. While an improvement, this technique is not robust for filtering moderate to heavy rain.
[0007] U.S. Patent No. 9,097,800 describes a method of combining radar with lidar to detect objects. The lidar is used to create a three-dimensional point map in an area surrounding the vehicle. Radar is then used to confirm that objects or hazards are solid materials rather than rain or airborne particles. While the system is designed to prevent false positive lidar detections, it requires the use of a separate radar system. Further, radar has inherently low resolution and can miss objects with weak reflectivity.
[0008] U.S. Patent Application No. 14/576,265 also describes a method of using lidar to confirm the presence of objects near an autonomous vehicle. The lidar detects multiple points of an object (i.e. right, left, top and bottom) to confirm its presence. The approach is similar to multi-echo methods as it requires multiple pulses. As with U.S. Patent No. 9,097,800, it may not be effective in detecting small objects.
[0009] Accordingly, there is a need for a system that can allow autonomous vehicles to navigate using lidar in rain or snow.
SUMMARY OF THE INVENTION
[0010] The invention recognizes that there is a need for an improved system for autonomous vehicles to navigate and detect objects and hazards. The system should be reliable and capable of operating in all weather conditions. The invention includes a system and rain filtering algorithm to allow autonomous vehicles to operate in rain or snow.
[0011] The following summary is provided to facilitate an understanding of some of the innovative features unique to the disclosed embodiments and is not intended to be a full description. A full appreciation of the various aspects of the embodiments disclosed herein can be gained by taking into consideration the entire specification, claims, drawings, and abstract as a whole.
[0012] The invention includes a system for detecting objects and/or navigating using laser light (i.e. lidar) as well as distinguishing solid objects from particles in the air. The system can include a first lidar device that scans an area by emitting pulses of laser light and detecting laser light that is reflected and a second lidar device that scans the same or substantially the same area by emitting pulses of laser light and detecting laser light that is reflected. A computer can use an algorithm to compare emitted laser light and reflected laser light detected by the first lidar device with emitted laser light and reflected laser light detected by the second lidar device to confirm the presence of a solid object. The presence of a solid object can be confirmed when the first lidar device and the second lidar device detect reflected laser light from the solid object. The first and second lidar devices can be mounted on an autonomous vehicle so that they are 0.5 to 2.5 meters apart from each other on a horizontal plane.
[0013] In addition to dual-lidar unit technology, the first lidar device and the second lidar device can utilize multi-echo technology to distinguish solid objects from airborne particles. Furthermore, the lidar devices can perform multiple plane scanning to distinguish solid objects from airborne particles.
[0014] The invention also includes a method of detecting objects or obstacles for autonomous driving. A first lidar device scans an area near an autonomous vehicle by emitting laser light and detecting reflected laser light (i.e. echoes). A second lidar device scans an area near an autonomous vehicle by emitting laser light and detecting echoes. Emitted pulses and echoes detected by the first lidar device are compared with emitted pulses and echoes detected by the second lidar device using a computer and an algorithm to detect and confirm the presence of an object or obstacle. The presence of a solid object can be confirmed when both the first lidar unit and the second lidar unit detect echoes from the object.
[0015] An echo that results from rain or snow can be identified by comparing echoes received by the first lidar device with echoes received by the second lidar device. The first lidar device and the second lidar device can use multi-echo scanning. Further, the lidar devices can scan a series of layers or planes. The presence of a solid object can be confirmed when the solid object appears in one or more layers or planes.
[0016] The invention also includes a method of detecting solid objects comprised of the steps of (a) scanning an area near an autonomous vehicle by emitting pulses of laser light and detecting echoes with a first lidar unit, (b) scanning substantially the same area near by emitting pulses of laser light and detecting echoes with a second lidar unit and (c) using an algorithm to compare pulses emitted and echoes received by the first lidar unit with pulses emitted and echoes received by the second lidar unit to detect and confirm the presence of an object or obstacle. A false positive (i.e. particulate matter in the air such as rain or snow) can be identified when one of the lidar units detects an echo but the other does not.
[0017] Further, the first lidar unit and the second lidar unit can use multi-echo scanning. They can also scan a series of planes. The presence of a solid object can be confirmed when the object appears in more than one plane. Furthermore, the presence of a solid object can be confirmed in the last echo of a multi-echo scan.
[0018] The invention also includes a method for detecting and/or confirming the presence of a solid object using lidar comprised of the steps of (a) using a first lidar device to scan multiple planes, wherein pulses of laser light are emitted and reflected laser light is detected, (b) collecting data related to laser light emitted from the first lidar device and reflected laser light detected by the first lidar device, (c) determining whether reflected light from the first lidar device is the result of particulate matter in the air by comparing multiple planes for the presence and/or absence of reflected light, (d) using a second lidar device to scan multiple planes, wherein pulses of laser light are emitted and reflected laser light is detected, (e) collecting data related to laser light emitted from the second lidar device and reflected laser light detected by the second lidar device, (f) determining whether reflected light from the second lidar device is the result of particulate matter in the air by comparing multiple planes for the presence and/or absence of reflected laser light and (g) using an algorithm to compare data from the first lidar device with data from the second lidar device to confirm that reflected light is the result of a solid object rather than particulate matter in the air. The multiple planes scanned by the first lidar device are in the same area or substantially the same area as the multiple planes scanned by the second lidar device. The presence of a solid object can be confirmed when reflected laser light is detected in four or more of the multiple planes of the first lidar device and/or four or more of the multiple planes of the second lidar device. Further, the first lidar device and the second lidar device can use multi-echo scanning.
INTRODUCTION
[0019] A first aspect of the invention is an improved system and method for detecting obstacles for autonomous driving using lidar.
[0020] A second aspect of the invention is an improved system and method for autonomous driving that utilizes multiple lidar systems.
[0021] A third aspect of the invention is an improved system and method for autonomous driving that utilizes multiple lidar systems and an algorithm to avoid falsely identifying water or particles in the air as solid objects.
[0022] A fourth aspect of the invention is an improved system and method for autonomous driving that utilizes multi-echo technology, scans multiple layers and employs an algorithm to mitigate interference caused from rain or particles in the air.
[0023] A fifth aspect of the invention is an improved system and method for autonomous driving that utilizes dual lidar systems, multi-echo technology and multiple layer scanning with an algorithm to improve the robustness and mitigate interference caused by rain or particles in the air.
Brief Description of Figures
[0024] The drawings described herein are for illustrative purposes only of selected embodiments and not all possible implementations, and are not intended to limit the scope of the present disclosure.
[0025] The summary above, as well as the following detailed description of illustrative embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the disclosure, exemplary constructions of the disclosure are shown in the drawings. However, the disclosure is not limited to specific methods and instrumentalities disclosed herein. Moreover, the drawings are not to scale.
[0026] FIG. 1A depicts a conventional multi-echo lidar system.
[0027] FIG. 1B depicts a multi-layer scan lidar wherein each scan layer is superimposed on a grid map.
[0028] FIG. 2 depicts a grid obtained by a multi-layer scan lidar. A solid object will be present on multiple layers.
[0029] FIG. 3A is a flowchart of an embodiment wherein a lidar sensor uses last echo filtering and a rain filter algorithm for obstacle feature detection.
[0030] FIG. 3B depicts dual lidar units used to detect obstacles according to an embodiment of the invention.
[0031] FIG. 3C is a flowchart of an embodiment of the invention wherein dual lidar sensors use last echo filtering and a rain filter algorithm to distinguish return signals caused by rain or particles in the air from those of solid objects.
[0032] FIG. 4A depicts an autonomous vehicle with dual lidar sensors separated by a distance“d.”
[0033] FIG. 4B depicts dual lidar sensors and their respective zones of detection.
[0034] FIG. 5A is a plot of a single plane lidar.
[0035] FIG. 5B is a plot of a single plane lidar using the dual lidar system.
[0036] FIG. 5C is a plot of a multi-plane lidar.
[0037] FIG. 5D is a plot of a multi-plane lidar using the dual lidar system.
DETAILED DESCRIPTION OF THE INVENTION
Definitions
[0038] While the invention is primarily described for the navigation of autonomous vehicles, it is understood that the invention is not so limited and can be used to assist with other endeavors that use lidar. Other applications include, for example, using the invention in other vehicles such as robots, drones or unmanned aircraft systems. The invention can also be used to improve the robustness of lidar in landscape imaging and mapping applications when particles such as rain or snow are present in the air.
[0039] Reference in this specification to "one embodiment/aspect" or "an embodiment/aspect" means that a particular feature, structure, or characteristic described in connection with the embodiment/aspect is included in at least one embodiment/aspect of the disclosure. The use of the phrase "in one embodiment/aspect" or "in another embodiment/aspect" in various places in the specification does not necessarily refer to the same embodiment/aspect, nor are separate or alternative embodiments/aspects mutually exclusive of other embodiments/aspects. Moreover, various features are described which may be exhibited by some embodiments/aspects and not by others. Similarly, various requirements are described which may be requirements for some embodiments/aspects but not other embodiments/aspects. "Embodiment" and "aspect" can in certain instances be used interchangeably.
[0040] The terms used in this specification generally have their ordinary meanings in the art, within the context of the disclosure, and in the specific context where each term is used. Certain terms that are used to describe the disclosure are discussed below, or elsewhere in the specification, to provide additional guidance to the practitioner regarding the description of the disclosure. For convenience, certain terms may be highlighted, for example using italics and/or quotation marks. The use of highlighting has no influence on the scope and meaning of a term; the scope and meaning of a term is the same, in the same context, whether or not it is highlighted. It will be appreciated that the same thing can be said in more than one way.
[0041] Consequently, alternative language and synonyms may be used for any one or more of the terms discussed herein. Nor is any special significance to be placed upon whether or not a term is elaborated or discussed herein. Synonyms for certain terms are provided. A recital of one or more synonyms does not exclude the use of other synonyms. The use of examples anywhere in this specification including examples of any terms discussed herein is illustrative only, and is not intended to further limit the scope and meaning of the disclosure or of any exemplified term. Likewise, the disclosure is not limited to various embodiments given in this specification.
[0042] Without intent to further limit the scope of the disclosure, examples of instruments, apparatus, methods and their related results according to the embodiments of the present disclosure are given below. Note that titles or subtitles may be used in the examples for convenience of a reader, which in no way should limit the scope of the disclosure. Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. In the case of conflict, the present document, including definitions, will control.
[0043] Directional and/or relational terms such as, but not limited to, left, right, nadir, apex, top, bottom, vertical, horizontal, back, front, and lateral are relative to each other, are dependent on the specific orientation of an applicable element or article, are used accordingly to aid in the description of the various embodiments in this specification and the appended claims, and are not necessarily intended to be construed as limiting.
[0044] As applicable, the terms “about” or “generally,” as used herein in the specification and appended claims, and unless otherwise indicated, mean a margin of +/- 20%. Also, as applicable, the term “substantially” as used herein in the specification and appended claims, unless otherwise indicated, means a margin of +/- 10%. It is to be appreciated that not all uses of the above terms are quantifiable such that the referenced ranges can be applied.
[0045] The term “multi-echo capability” refers to the ability of a lidar unit to gather and evaluate multiple (e.g. three) echoes per transmitted laser pulse. Once an echo reaches the receiver of the unit, the received intensity is transformed into a voltage. An echo from a solid object will usually yield a high voltage over a long period of time. An echo of a rain drop, however, yields a very low voltage over a short period of time.
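The voltage-based distinction described above can be illustrated with a short sketch. This is not part of the disclosed embodiments; the function name and the threshold values are illustrative assumptions only.

```python
# Illustrative sketch (not part of the disclosure): classify a received
# echo as a candidate solid object or as rain based on the received
# voltage and its duration. Threshold values are assumed for illustration.

def classify_echo(voltage: float, duration_ns: float,
                  min_voltage: float = 0.5, min_duration_ns: float = 10.0) -> str:
    """Return 'solid' for a strong, long echo and 'rain' otherwise."""
    if voltage >= min_voltage and duration_ns >= min_duration_ns:
        return "solid"
    return "rain"

print(classify_echo(2.0, 40.0))  # strong, long echo -> solid
print(classify_echo(0.1, 3.0))   # weak, short echo  -> rain
```

In practice the thresholds would be calibrated for the specific lidar unit and ambient conditions.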
[0046] The term “multi-layer technology” refers to a lidar unit that allows for a pitch angle compensation by means of multiple (e.g. four) scan planes with different vertical angles. In a preferred design, the photo diode receiver of the unit includes multiple (e.g. four) independent receivers arranged in a line. Each receiver scans a single plane, thus dividing the vertical aperture into multiple planes.
[0047] The term “SLAM” refers to Simultaneous Localization And Mapping, which enables accurate mapping where GPS localization is unavailable, such as indoor spaces. SLAM algorithms use LiDAR and IMU (Inertial Measurement Unit) data to simultaneously locate the sensor and generate a coherent map of its surroundings.
[0048] The term “Time-of-Flight Principle” refers to a method for measuring the distance between a sensor and an object, based on the time difference between the emission of a signal and its return to the sensor, after being reflected by an object.
[0049] Other technical terms used herein have their ordinary meaning in the art that they are used, as exemplified by a variety of technical dictionaries. The particular values and configurations discussed in these non-limiting examples can be varied and are cited merely to illustrate at least one embodiment and are not intended to limit the scope thereof.
Description of Preferred Embodiments
[0050] Lidar is fundamentally a distance measuring technology. A lidar system sends light energy to the ground or toward an object. This emitted light can be referred to as a “beam” or “pulse.” The lidar unit measures light that is reflected back to the sensor. This reflected light can be referred to as the “echo” or “return.” The spatial distance between the lidar system and a measured point is calculated by comparing the delay between the pulse and return.
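The delay-to-distance calculation follows the Time-of-Flight Principle defined above and can be sketched as follows (an illustrative sketch, not part of the disclosure):

```python
# Time-of-flight sketch: the range to a measured point is half the
# round-trip delay between pulse and return, multiplied by the speed
# of light (the light travels to the object and back).

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(delay_s: float) -> float:
    """Distance in metres for a measured pulse-to-return delay."""
    return C * delay_s / 2.0

# A 100 ns round trip corresponds to roughly 15 m.
print(round(tof_distance(100e-9), 2))  # -> 14.99
```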
[0051] Lidar is often a preferred system for use in autonomous vehicles because it can accurately map the three-dimensional surroundings of a vehicle to a high resolution. However, rain or particulate matter in the air can render lidar ineffective as the particles reflect laser light back to the lidar unit. As light is reflected, an autonomous vehicle will slow or stop as the airborne particulates are indistinguishable from solid objects.
[0052] FIG. 1A depicts a lidar unit 110 using last echo filtering to detect an object 120. The technique is also referred to as multi-echo scanning. Rain 115 is depicted between the lidar unit 110 and an object 120. One pulse can generate multiple echoes (i.e. first echo, intermediate echo, last echo) of laser light to the lidar unit. This can occur when particles in the air such as rain reflect laser light back to the lidar unit 110. A first echo 130 is depicted wherein the signal is reflected from particles in the air. Intermediate echoes can also occur as a result of particles in the air. A last echo 140 is depicted wherein the signal (i.e. echo) is reflected from a solid object 120.
[0053] The system can analyze echoes to identify those that are a result of rain or particles in the air. Using a “last echo filter,” the first and intermediate echoes are attributed to rain or particles in the air. The last echo is attributed to a solid object. However, this technique has limitations and is ineffective in heavy rain or with substantial particulate matter in the air.
[0054] FIG. 1B depicts multi-layer scanning, another technique aimed at identifying echoes or returns from rain 115 or particulate matter in the air. A lidar unit 110 scans multiple angles. The scan of each angle can be attributed to a layer 130, plane or stacked plane. The system can compare data from the layers to confirm the presence or absence of an object. An object that appears on a single layer will likely be attributed to a particle in the air. In a preferred method, the presence of a solid object is confirmed when echoes are detected in four or more adjacent layers.
[0055] Each scan angle/layer can be superimposed on a grid map as depicted in FIG. 2. The results of a scan are represented on four layers. Of four scanned angles, each angle is depicted on a horizontal grid. Each shaded square indicates that the lidar unit detected an echo. A conventional lidar system will indicate the presence of two objects, 215 and 225. Analysis of data from multiple layers can distinguish a false positive reading from a true reading.
[0056] If obstacles are detected in scan layer 1 but not in scan layer 2 or N at the exact same position, then the obstacles can be identified as particles in the air such as rain. Although the first object 215 is present in multiple layers, it is absent in an intermediate layer (Scan Layer 3). Further, its location varies among layers. In this circumstance, the system can conclude that the echoes are attributable to rain or particles in the air. In contrast, the second object 225 is present in multiple consecutive layers at the same location. With this data, the system can confirm that signals are attributable to a solid object. In a preferred method, an echo in four cells is used to confirm the presence of an object or obstacle. If the object is in three or fewer cells, it is attributed to rain or airborne particulates.
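The grid-based multi-layer confirmation can be sketched as follows. The representation of a layer as a set of occupied grid cells is an assumption for illustration; the four-layer threshold follows the preferred method described above.

```python
# Multi-layer confirmation sketch (illustrative): a grid cell is
# confirmed as a solid object when an echo appears at that cell in at
# least `min_layers` scan layers; fewer detections are attributed to
# rain or airborne particulates.

def confirm_cell(layers: list[set[tuple[int, int]]], cell: tuple[int, int],
                 min_layers: int = 4) -> bool:
    """True when `cell` holds an echo in at least `min_layers` layers."""
    hits = sum(1 for layer in layers if cell in layer)
    return hits >= min_layers

# A solid object appears in all four layers at cell (3, 1); a raindrop
# appears in only one layer at cell (0, 2).
layers = [{(3, 1), (0, 2)}, {(3, 1)}, {(3, 1), (0, 3)}, {(3, 1)}]
print(confirm_cell(layers, (3, 1)))  # -> True  (solid object)
print(confirm_cell(layers, (0, 2)))  # -> False (rain)
```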
[0057] FIG. 3A is a flowchart that depicts a lidar unit that employs both last echo filtering and a rain filter algorithm. The lidar unit can use a last echo filter 305 and a rain filter algorithm 315 that analyzes data from multi-layer scanning for obstacle and feature detection 325. This improves the reliability and robustness of lidar when airborne particles such as rain are present.
[0058] FIG. 3B depicts an embodiment of the invention wherein two lidar units function concurrently to improve the robustness of an autonomous vehicle navigation system and allow it to operate during rainy or snowy conditions. The system can include a first (i.e. left) lidar unit 210 and a second (i.e. right) lidar unit 220 that scan substantially the same area. Data can be collected from each lidar unit and compared using an algorithm to confirm that a signal is the result of a solid object.
[0059] If the first lidar unit receives reflected light from an object, the system confirms the presence of an object using the second lidar unit. The system can use an algorithm to analyze data from each lidar unit which includes emitted light, reflected light (i.e. echoes) as well as the angles of reflection and movement of objects. In this manner, the second lidar unit confirms that reflected light received by the first lidar unit is the result of a solid object. A reflected pulse (i.e. echo) that is detected by one of the units, without confirmation from the other, can be attributed to a particle in the air and classified as a false positive. The following principles can be applied to this method:
• Solid objects/obstacles are detected by both lidar units (210, 220).
• An echo that is detected by the left lidar unit 210 that is not detected by right lidar unit 220 can be attributed to a particle in the air such as rain.
• An echo that is detected by the right lidar unit 220 that is not detected by left lidar unit 210 can be attributed to a particle in the air such as rain.
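The principles above can be sketched as a simple cross-check between the two units' detections. This is an illustrative sketch only: the representation of detections as (x, y) points in a shared vehicle frame and the matching tolerance are assumptions, not part of the disclosed embodiments.

```python
# Dual-lidar cross-check sketch (illustrative): a detection is kept as a
# solid object only when both units report an echo at approximately the
# same position; unmatched echoes from either unit are classified as rain.

def filter_rain(left_pts, right_pts, tol=0.2):
    """Return (solid, rain) point lists from the two units' detections."""
    def near(p, pts):
        return any(abs(p[0] - q[0]) <= tol and abs(p[1] - q[1]) <= tol
                   for q in pts)
    solid = [p for p in left_pts if near(p, right_pts)]
    rain = [p for p in left_pts if not near(p, right_pts)] + \
           [q for q in right_pts if not near(q, left_pts)]
    return solid, rain

left = [(10.0, 0.0), (3.0, 1.5)]     # object + raindrop seen by left unit
right = [(10.1, 0.05), (6.0, -2.0)]  # object + raindrop seen by right unit
solid, rain = filter_rain(left, right)
print(solid)  # -> [(10.0, 0.0)]
print(rain)   # -> [(3.0, 1.5), (6.0, -2.0)]
```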
[0060] This dual-unit approach can be combined with other filter techniques to improve its robustness and reliability such as a last echo filter and/or multi-layer scanning. FIG. 3B depicts a first echo 130 along with a last echo 140.
[0061] FIG. 3C is a flowchart that depicts a system of two lidar units according to an embodiment of the invention. The first lidar unit 210 and the second lidar unit 220 can use last echo filtering (310, 320). Data from each unit is collected, processed and analyzed using a rain filter algorithm 330. The algorithm can, for example, attribute an echo to rain if it is detected by one lidar unit rather than the pair. This leads to more robust obstacle and feature detection 340. An autonomous vehicle can avoid objects/hazards without slowing and/or stopping due to false positive signals from rain or particles in the air.
[0062] FIG. 4A depicts a preferred arrangement of the system wherein a first lidar unit 210 and a second lidar unit 220 are mounted on the roof of an automobile 410 and separated by a distance “d” from one another. The distance “d” can be determined empirically and is preferably between 0.5 meters and 2.5 meters. However, the lidar units can be separated by a distance according to variables such as the vehicle size, resolution of the lidar units, anticipated distance of objects/hazards and the amount of particulate matter in the air. The ideal distance can vary from, for example, 1 cm for a small mobile robot to several meters for a large drone. As may be appreciated, the first lidar unit 210 and the second lidar unit 220 can be mounted at other external positions on the vehicle (e.g. the bumper or hood). In a preferred arrangement, the lidar units are positioned on a horizontal plane with one another so that laser beams are emitted on the same planes.
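Before the two units' scans can be compared, their points must be expressed in a common vehicle frame using the known separation “d.” The following sketch is illustrative only; the frame convention (units offset laterally by +/- d/2) is an assumption for illustration.

```python
# Mounting-geometry sketch (illustrative): each unit's scan points are
# translated into a shared vehicle frame using its known lateral mounting
# offset, so that echoes from the same object coincide.

def to_vehicle_frame(points, lateral_offset):
    """Shift a unit's (x, y) points by its lateral mounting offset."""
    return [(x, y + lateral_offset) for x, y in points]

d = 1.0  # assumed separation between the units, metres
left_pts = to_vehicle_frame([(12.0, 0.0)], +d / 2)   # left unit at +d/2
right_pts = to_vehicle_frame([(12.0, 1.0)], -d / 2)  # right unit at -d/2
print(left_pts, right_pts)  # both map the same object to (12.0, 0.5)
```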
[0063] FIG. 4B depicts the zones of detection 400 of a first (i.e. left) lidar unit 210 and a second (i.e. right) lidar unit 220. Each system scans a substantially circular area to detect an object 120. The scans overlap in the center area 235. The number and direction of the rays emitted by each sensor can vary based on the lidar unit and the setting.
[0064] While the units scan substantially the same area, data collected from the area of overlap can be analyzed with an algorithm to confirm that a signal is the result of a solid object. Rain or particles in the air can reflect laser light to the lidar units. A solid object 120 is detected from laser light reflected to both lidar units.
[0065] The two lidar units function concurrently and echoes are analyzed. Multi-unit technology can be combined with last echo filtering techniques and multi-layer scanning to improve the robustness of autonomous vehicle navigation and allow operation during conditions where particles such as rain or snow are present in the air.
WORKING EXAMPLE
Use of Lidar for Vehicle Navigation in Rain
[0066] The system was tested with an autonomous vehicle in a rain tunnel with a steady flow of rain of 10 millimeters (mm) per hour. Lidar units were affixed to the roof of a vehicle at a distance of approximately one meter from each other. Two types of lidar were used: a single plane (LMS151 unit) and a four-plane (LDMRS unit). The lidar units used multi-echo technology.
[0067] FIG. 5A - 5D are top view representations of the environment with an autonomous vehicle at the center. The horizontal axis represents the distance away from the front and rear of the vehicle in meters. The vertical axis represents the distance away from the sides of the vehicle in meters. Each segment depicts a distance of 5 meters.
[0068] FIG. 5A is a plot of a single plane lidar using multi-echo scanning. The circled areas are known locations of solid objects (pillars in the rain tunnel). The solid objects are detected along with additional false positives from rain drops 505. Most of the false positives are detected in the area in front of the vehicle 505.
[0069] The test was repeated using a (single plane) dual lidar system wherein a first lidar system and a second lidar system scan the area. FIG. 5B is a plot of a single plane lidar with data from the dual lidar system. Only scan points detected by both the first lidar system and the second lidar system are included. As in FIG. 5A, the horizontal axis represents the distance away from the front and rear of the vehicle. The vertical axis represents the distance away from the sides of the vehicle.
[0070] Return signals attributable to rain can be identified by comparing FIG. 5A and FIG. 5B. Without the dual lidar system, false positives are present and attributed to echoes from rain drops in the air 505. These false positives are absent with the use of the dual lidar system. The circled areas are structural objects (e.g. pillars) that are present and detected in both tests.
[0071] FIG. 5C and FIG. 5D demonstrate the use of a multi-plane lidar filter in detecting false positive echoes. FIG. 5C is a plot of a multi-plane lidar (unfiltered). The horizontal axis represents the distance away from the front and rear of the vehicle. The vertical axis represents the distance away from the sides of the vehicle. Each segment depicts a distance of 5 meters. Points along the horizontal axis indicate the detection of false positives due to rain.
[0072] The test was repeated using a (multi-plane) dual system wherein a first lidar system and a second lidar system scan the area. FIG. 5D is a plot of a multi-plane lidar that uses a dual lidar system. Many of the plotted points are absent with the use of a dual lidar filter. These points are attributed to rain drops in the air. The points that are present in both FIG. 5C and FIG. 5D can be attributed to physical objects (i.e. features).
[0073] Additional modifications/components can improve the robustness of the system in heavy rain (e.g. rain of more than 20 mm/hour). One approach is to use lidar units capable of analyzing additional layers (e.g. five or more layers) for better penetration in rain. Lidar units with higher refresh rates can also be used to filter out rain droplets. A mechanical air blower can be implemented to provide an air curtain with a range (e.g. up to one meter) to reduce the amount of rain drops being detected by the lidar units.
[0074] It will be appreciated that variations of the above disclosed and other features and functions, or alternatives thereof, may be combined into other systems or applications. Also, various unforeseen or unanticipated alternatives, modifications, variations or improvements therein may be subsequently made by those skilled in the art which are also intended to be encompassed by the following claims.
[0075] Although embodiments of the current disclosure have been described comprehensively, in considerable detail to cover the possible aspects, those skilled in the art would recognize that other versions of the disclosure are also possible.

Claims

What is claimed is:
1. A system for distinguishing solid objects from particles in the air using laser light, said system comprised of:
a first lidar device that scans an area by emitting pulses of laser light and detecting laser light that is reflected;
a second lidar device that scans the same or substantially the same area as the first lidar device by emitting pulses of laser light and detecting laser light that is reflected;
a computer that compares emitted pulses and reflected laser light detected by the first lidar device with emitted pulses and reflected laser light detected by the second lidar device;
wherein the presence of a solid object is confirmed when the first lidar device and the second lidar device detect reflected laser light from the solid object.
2. The system of claim 1, wherein the first lidar device and the second lidar device utilize multi-echo technology to distinguish solid objects from particles in the air.
3. The system of claim 1, wherein the first lidar device and the second lidar device scan multiple planes and compare echoes from one or more planes to confirm that one or more echoes is reflected from a solid object rather than particles in the air.
4. The system of claim 1, wherein the first lidar device and the second lidar device are positioned on a horizontal plane at a distance of 0.5 to 2.5 meters from each other.
5. A method of distinguishing solid objects from particles in the air comprised of the steps of:
scanning an area near an autonomous vehicle by emitting pulses of laser light and detecting echoes with a first lidar unit;
scanning substantially the same area by emitting pulses of laser light and detecting echoes with a second lidar unit;
using an algorithm to compare pulses emitted and echoes received by the first lidar unit with pulses emitted and echoes received by the second lidar unit; and
confirming the presence of a solid object when the first lidar unit and the second lidar unit detect echoes from the solid object.
6. The method of claim 5, wherein the first lidar unit and the second lidar unit use multi-echo scanning.
7. The method of claim 5, wherein the first lidar unit and the second lidar unit scan multiple planes and compare echoes from one or more planes to distinguish solid objects from airborne particles.
8. The method of claim 7, wherein the presence of a solid object is confirmed when echoes are detected in more than one plane.
9. The method of claim 5, wherein the first lidar unit and the second lidar unit are positioned at a distance of 0.5 to 2.5 meters from each other on a horizontal plane.
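The comparison step of claims 5-9 can be sketched as a per-azimuth agreement check between the two co-registered lidar units: a range reading is confirmed only when both units see a return at that bearing and the readings agree. The function name, the `None`-for-no-return convention, and the 0.5 m agreement threshold are illustrative assumptions.

```python
# Hypothetical per-azimuth sketch of the comparison in claims 5-9.
# ranges_a / ranges_b: range readings (metres) per azimuth bin for
# the two lidar units, with None where a unit saw no return.
def corroborate_ranges(ranges_a, ranges_b, max_diff=0.5):
    """Confirm a reading only when both units report a return at the
    same azimuth and the ranges agree to within max_diff metres;
    a return seen by only one unit is treated as rain and dropped."""
    confirmed = []
    for ra, rb in zip(ranges_a, ranges_b):
        if ra is not None and rb is not None and abs(ra - rb) <= max_diff:
            confirmed.append((ra + rb) / 2.0)  # fuse the two readings
        else:
            confirmed.append(None)
    return confirmed
```

A real system would first transform both scans into a common frame to account for the 0.5 to 2.5 m baseline of claim 9; that registration step is omitted here.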
10. A method of distinguishing solid objects or obstacles from particles present in the air comprising the steps of:
a) using a first lidar device to scan an area by emitting pulses of laser light and detecting reflected laser light;
b) using a second lidar device to scan substantially the same area by emitting pulses of laser light and detecting reflected laser light;
c) collecting data related to emitted laser light and reflected laser light from the first lidar device;
d) collecting data related to emitted laser light and reflected laser light from the second lidar device; and
e) determining that reflected laser light is the result of particles in the air when one of the first lidar device and the second lidar device detects reflected laser light that is not detected by both the first lidar device and the second lidar device.
11. The method of claim 10, wherein the first lidar device and the second lidar device use multi-echo scanning.
12. The method of claim 10, wherein the first lidar device and the second lidar device scan multiple planes and compare echoes from one or more planes to distinguish solid objects from particles in the air.
13. The method of claim 12, wherein the presence of a solid object is confirmed when echoes are detected in more than one plane.
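The multi-plane test of claims 12-13 rests on the observation that a raindrop is small and rarely intersects the beams of several vertically stacked scan planes at once, whereas a solid object does. A minimal sketch of that test follows; the function name, the boolean hit-array representation, and the two-plane threshold (the "more than one plane" of claim 13) are illustrative assumptions.

```python
# Hypothetical sketch of the multi-plane check in claims 12-13.
# plane_hits: one boolean array per scan plane, indexed by azimuth
# bin, where True means that plane received an echo at that bearing.
def plane_consistency(plane_hits, min_planes=2):
    """Accept an azimuth bin as a solid object only when echoes are
    present in at least min_planes of the stacked scan planes;
    single-plane echoes are treated as airborne particles."""
    n = len(plane_hits[0])
    confirmed = []
    for i in range(n):
        count = sum(1 for plane in plane_hits if plane[i])
        confirmed.append(count >= min_planes)
    return confirmed
```

For instance, with three planes reporting `[True, False, True]`, `[True, False, False]`, and `[False, False, True]`, only the first and last azimuth bins are confirmed, each being seen in two planes.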
14. A method for confirming the presence of a solid object using lidar, comprising the steps of:
a) using a first lidar device to scan multiple planes, wherein pulses of laser light are emitted and reflected laser light is detected;
b) collecting data related to laser light emitted from the first lidar device and reflected laser light detected by the first lidar device;
c) determining whether reflected light from the first lidar device is the result of particles in the air by comparing multiple planes for the presence and/or absence of reflected light;
d) using a second lidar device to scan multiple planes, wherein pulses of laser light are emitted and reflected laser light is detected;
e) collecting data related to laser light emitted from the second lidar device and reflected laser light detected by the second lidar device;
f) determining whether reflected laser light from the second lidar device is the result of particles in the air by comparing multiple planes for the presence and/or absence of reflected laser light; and
g) using an algorithm to compare data from the first lidar device with data from the second lidar device;
wherein the multiple planes scanned by the first lidar device are in the same area or substantially the same area as the multiple planes scanned by the second lidar device; and
wherein the presence of a solid object is confirmed when the first lidar device and the second lidar device detect reflected laser light in multiple planes from the solid object.
15. The method of claim 14, wherein the presence of a solid object is confirmed when reflected laser light is detected in four or more of the multiple planes of the first lidar device and/or four or more of the multiple planes of the second lidar device.
16. The method of claim 14, wherein the first lidar device and the second lidar device use multi-echo scanning.
17. The method of claim 14, wherein the first lidar device and the second lidar device are positioned at a distance of 0.5 to 2.5 meters from each other on a horizontal plane.
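Claims 14-17 combine both filters: each device's multi-plane scan is checked on its own (claim 15 gives four or more planes as a confirming threshold) and the two devices are then cross-compared. A single-cell decision sketch is given below; the function name, the boolean plane-list representation, and the exact combination of the "and/or" in claim 15 are assumptions made for illustration.

```python
# Hypothetical single-cell decision combining claims 14-15: each
# device contributes one boolean per scan plane for the candidate
# location, and an object is confirmed only when (a) both devices
# detect it and (b) at least one device sees it in min_planes planes.
def confirm_object(planes_dev1, planes_dev2, min_planes=4):
    """Fuse the two devices' multi-plane evidence for one candidate."""
    hits1 = sum(planes_dev1)  # number of planes with an echo, device 1
    hits2 = sum(planes_dev2)  # number of planes with an echo, device 2
    both_detect = hits1 > 0 and hits2 > 0
    enough_planes = hits1 >= min_planes or hits2 >= min_planes
    return both_detect and enough_planes
```

So an object seen in four planes by one device and at least one plane by the other is confirmed, while an object visible to only one device is rejected as rain regardless of plane count.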

Priority Applications (6)

Application Number Priority Date Filing Date Title
JP2020528321A JP2021514457A (en) 2017-12-08 2017-12-08 Rain filtering techniques for autonomous vehicles
PCT/SG2017/050607 WO2019112514A1 (en) 2017-12-08 2017-12-08 Rain filtering techniques for autonomous vehicle
AU2017442202A AU2017442202A1 (en) 2017-12-08 2017-12-08 Rain filtering techniques for autonomous vehicle
SG11202005246TA SG11202005246TA (en) 2017-12-08 2017-12-08 Rain filtering techniques for autonomous vehicle
TW107144162A TW201932868A (en) 2017-12-08 2018-12-07 Rain filtering techniques for autonomous vehicle
IL275162A IL275162A (en) 2017-12-08 2020-06-07 Rain filtering techniques for autonomous vehicles


Publications (1)

Publication Number Publication Date
WO2019112514A1 true WO2019112514A1 (en) 2019-06-13

Family

ID=66751135

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG2017/050607 WO2019112514A1 (en) 2017-12-08 2017-12-08 Rain filtering techniques for autonomous vehicle

Country Status (6)

Country Link
JP (1) JP2021514457A (en)
AU (1) AU2017442202A1 (en)
IL (1) IL275162A (en)
SG (1) SG11202005246TA (en)
TW (1) TW201932868A (en)
WO (1) WO2019112514A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112859010A (en) * 2019-11-28 2021-05-28 怡利电子工业股份有限公司 Alarming method for preventing raindrop misinformation of millimeter wave radar
TWI732365B (en) * 2019-11-29 2021-07-01 怡利電子工業股份有限公司 Warning method for preventing raindrop false alarm of millimeter wave radar
CN115825982A (en) * 2023-02-02 2023-03-21 深圳煜炜光学科技有限公司 Method and system for scanning point cloud data of unmanned aerial vehicle in rainy environment
US11640170B1 (en) 2019-10-29 2023-05-02 Zoox, Inc. Identification of particulate matter in sensor data
US11643072B2 (en) * 2019-09-27 2023-05-09 Zoox, Inc. Planning accommodations for particulate matter
EP4130798A4 (en) * 2020-04-15 2023-05-31 Huawei Technologies Co., Ltd. Target identification method and device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2024052332A (en) * 2022-09-30 2024-04-11 株式会社小松製作所 Work site detection system and work site detection method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5314037A (en) * 1993-01-22 1994-05-24 Shaw David C H Automobile collision avoidance system
JP2013167479A (en) * 2012-02-14 2013-08-29 Toyota Motor Corp Laser radar device and object detection method using the same
US20150009485A1 (en) * 2013-07-02 2015-01-08 Electronics And Telecommunications Research Institute Laser radar system
CN204882872U (en) * 2015-09-09 2015-12-16 厦门理工学院 To keeping away barrier system before car
CN105607075A (en) * 2015-09-08 2016-05-25 北京铁路局北京科学技术研究所 Road safety monitoring method and apparatus thereof
CA2980305A1 (en) * 2015-03-25 2016-09-29 Waymo Llc Vehicle with multiple light detection and ranging devices (lidars)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009069020A (en) * 2007-09-13 2009-04-02 Toshiba Corp Radar device
JP5092076B2 (en) * 2007-10-26 2012-12-05 オプテックス株式会社 Laser area sensor
JP2016191617A (en) * 2015-03-31 2016-11-10 シャープ株式会社 Obstacle determination device
JP6598517B2 (en) * 2015-06-04 2019-10-30 東日本高速道路株式会社 Reverse running vehicle detection system using light waves



Also Published As

Publication number Publication date
JP2021514457A (en) 2021-06-10
AU2017442202A1 (en) 2020-06-25
SG11202005246TA (en) 2020-07-29
TW201932868A (en) 2019-08-16
IL275162A (en) 2020-07-30

Similar Documents

Publication Publication Date Title
US11609329B2 (en) Camera-gated lidar system
WO2019112514A1 (en) Rain filtering techniques for autonomous vehicle
US11703879B2 (en) All weather autonomously driven vehicles
KR102614323B1 (en) Create a 3D map of a scene using passive and active measurements
EP2721593B1 (en) System and method for traffic side detection and characterization
CN110389586B (en) System and method for ground and free space exploration
US10664974B2 (en) System and method for object detection using edge characteristics
US20150378015A1 (en) Apparatus and method for self-localization of vehicle
US20150336575A1 (en) Collision avoidance with static targets in narrow spaces
Taraba et al. Utilization of modern sensors in autonomous vehicles
KR20120072131A (en) Context-aware method using data fusion of image sensor and range sensor, and apparatus thereof
US20160299229A1 (en) Method and system for detecting objects
CN108333589A (en) A kind of automatic driving vehicle obstacle detector
RU2763800C1 (en) Onboard sensor system
CN110412565A (en) Sensing system and automatic driving vehicle
US20180321377A1 (en) Method for capturing a surrounding region of a motor vehicle with object classification, control device, driver assistance system and motor vehicle
US20180004221A1 (en) Autonomous guidance system
KR101449288B1 (en) Detection System Using Radar
CN207851290U (en) A kind of automatic driving vehicle obstacle detector
KR101428281B1 (en) System for detecting road-kerb of veichle and method thereof
US11914679B2 (en) Multispectral object-detection with thermal imaging
EP4379424A1 (en) Multipath object identification for navigation
US11906623B1 (en) Velocity estimation using light detection and ranging (LIDAR) system
CN211032395U (en) Autonomous vehicle
WO2023105463A1 (en) A system and method for lidar blockage detection

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17934170; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2020528321; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2017442202; Country of ref document: AU; Date of ref document: 20171208; Kind code of ref document: A)
122 Ep: pct application non-entry in european phase (Ref document number: 17934170; Country of ref document: EP; Kind code of ref document: A1)