WO2019112514A1 - Rain filtering techniques for autonomous vehicle - Google Patents
- Publication number
- WO2019112514A1 (PCT/SG2017/050607)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- lidar
- lidar device
- laser light
- reflected
- echoes
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/87—Combinations of systems using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4814—Constructional features, e.g. arrangements of optical elements of transmitters alone
- G01S7/4815—Constructional features, e.g. arrangements of optical elements of transmitters alone using multiple transmitters
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4817—Constructional features, e.g. arrangements of optical elements relating to scanning
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
Definitions
- the present invention relates to a system and method of using lidar for autonomous vehicle navigation when particulate matter is present in the air such as rain or snow.
- An autonomous car is one that is capable of monitoring its environment and navigating without human input. Benefits of autonomous cars include greater vehicle safety, greater efficiency, lower costs, decreased congestion and greater mobility for those who are unfit or not licensed to drive.
- CMOS cameras can be used to detect objects and hazards in their surroundings.
- each type of sensor has limitations.
- CMOS cameras rely on adequate lighting to detect objects.
- the cameras can be ineffective when light is projected back by, for example, headlights of an approaching vehicle.
- Other shortcomings of cameras include inadequate lighting, motion blurring, limited field-of-view and/or detection range.
- Radar does not rely on visible light but lacks the resolution needed to identify smaller objects.
- Laser light can be used to overcome these limitations but can be ineffective during rain or snow.
- Lidar (Light Detection and Ranging) measures the distance to an object by illuminating it with a pulsed laser light and measuring the reflected pulses with a sensor.
- a lidar system reflects multiple laser pulses off of objects surrounding the vehicle. Differences in laser return times and wavelengths can then be used to make digital 3D representations of objects.
- it is difficult for lidar to transmit through snow, rain, fog and dust in the air. Rain or other particles in the air can reflect laser light back to the system in the same manner as a solid object. When pulsed laser light is reflected back to a lidar system in this way, an autonomous vehicle will unnecessarily slow or stop due to the “false positive” detection of obstacles.
- lidar sensors with “multi-echo” or “last echo” technology can filter reflections caused by dust, rain and fog. They operate under the premise that a portion of the energy from a pulse may be reflected by particle matter. The remainder of the beam can continue to propagate and is reflected by a solid object. When this occurs, the lidar unit can ignore the closer, weaker reflections caused by rain or particulates in the air. While an improvement, this technique is not robust for filtering moderate to heavy rain.
- U.S. Patent No. 9,097,800 describes a method of combining radar with lidar to detect objects.
- the lidar is used to create a three-dimensional point map in an area surrounding the vehicle. Radar is then used to confirm that objects or hazards are solid materials rather than rain or airborne particles. While the system is designed to prevent false positive lidar detections, it requires the use of a separate radar system. Further, radar has inherently low resolution and can miss objects with weak reflectivity.
- U.S. Patent Application No. 14/576,265 also describes a method of using lidar to confirm the presence of objects near an autonomous vehicle.
- the lidar detects multiple points of an object (i.e. right, left, top and bottom) to confirm its presence.
- the approach is similar to multi-echo methods as it requires multiple pulses.
- like U.S. Patent No. 9,097,800, it may not be effective in detecting small objects.
- the invention recognizes that there is a need for an improved system for autonomous vehicles to navigate and detect objects and hazards.
- the system should be reliable and capable of operating in all weather conditions.
- the invention includes a system and rain filtering algorithm to allow autonomous vehicles to operate in rain or snow.
- the invention includes a system for detecting objects and/or navigating using laser light (i.e. lidar) as well as distinguishing solid objects from particles in the air.
- the system can include a first lidar device that scans an area by emitting pulses of laser light and detecting laser light that is reflected and a second lidar device that scans the same or substantially the same area by emitting pulses of laser light and detecting laser light that is reflected.
- a computer can use an algorithm to compare emitted laser light and reflected laser light detected by the first lidar device with emitted laser light and reflected laser light detected by the second lidar device to confirm the presence of a solid object.
- the presence of a solid object can be confirmed when the first lidar device and the second lidar device detect reflected laser light from the solid object.
- the first and second lidar devices can be mounted on an autonomous vehicle so that they are 0.5 to 2.5 meters apart from each other on a horizontal plane.
- the first lidar device and the second lidar device can utilize multi-echo technology to distinguish solid objects from airborne particles.
- the lidar devices can perform multiple plane scanning to distinguish solid objects from airborne particles.
- the invention also includes a method of detecting objects or obstacles for autonomous driving.
- a first lidar device scans an area near an autonomous vehicle by emitting laser light and detecting reflected laser light (i.e. echoes).
- a second lidar device scans an area near an autonomous vehicle by emitting laser light and detecting echoes. Emitted pulses and echoes detected by the first lidar device are compared with emitted pulses and echoes detected by the second lidar device using a computer and an algorithm to detect and confirm the presence of an object or obstacle. The presence of a solid object can be confirmed when both the first lidar unit and the second lidar unit detect echoes from the object.
- An echo that results from rain or snow can be identified by comparing echoes received by the first lidar device with echoes received by the second lidar device.
- the first lidar device and the second lidar device can use multi-echo scanning. Further, the lidar devices can scan a series of layers or planes. The presence of a solid object can be confirmed when the solid object appears in one or more layers or planes.
- the invention also includes a method of detecting solid objects comprised of the steps of (a) scanning an area near an autonomous vehicle by emitting pulses of laser light and detecting echoes with a first lidar unit, (b) scanning substantially the same area by emitting pulses of laser light and detecting echoes with a second lidar unit and (c) using an algorithm to compare pulses emitted and echoes received by the first lidar unit with pulses emitted and echoes received by the second lidar unit to detect and confirm the presence of an object or obstacle.
- a false positive (i.e. particulate matter in the air such as rain or snow)
- the first lidar unit and the second lidar unit can use multi-echo scanning. They can also scan a series of planes. The presence of a solid object can be confirmed when the object appears in more than one plane. Furthermore, the presence of a solid object can be confirmed in the last echo of a multi-echo scan.
- the invention also includes a method for detecting and/or confirming the presence of a solid object using lidar comprised of the steps of (a) using a first lidar device to scan multiple planes, wherein pulses of laser light are emitted and reflected laser light is detected, (b) collecting data related to laser light emitted from the first lidar device and reflected laser light detected by the first lidar device, (c) determining whether reflected light from the first lidar device is the result of particulate matter in the air by comparing multiple planes for the presence and/or absence of reflected light, (d) using a second lidar device to scan multiple planes, wherein pulses of laser light are emitted and reflected laser light is detected, (e) collecting data related to laser light emitted from the second lidar device and reflected laser light detected by the second lidar device, and (f) determining whether reflected light from the second lidar device is the result of particulate matter in the air by comparing multiple planes for the presence and/or absence of reflected laser light.
- the multiple planes scanned by the first lidar device are in the same area or substantially the same area as the multiple planes scanned by the second lidar device.
- the presence of a solid object can be confirmed when reflected laser light is detected in four or more of the multiple planes of the first lidar device and/or four or more of the multiple planes of the second lidar device.
- the first lidar device and the second lidar device can use multi-echo scanning.
- a first aspect of the invention is an improved system and method for detecting obstacles for autonomous driving using lidar.
- a second aspect of the invention is an improved system and method for autonomous driving that utilizes multiple lidar systems.
- a third aspect of the invention is an improved system and method for autonomous driving that utilizes multiple lidar systems and an algorithm to avoid falsely identifying water or particles in the air as solid objects.
- a fourth aspect of the invention is an improved system and method for autonomous driving that utilizes multi-echo technology, scans multiple layers and employs an algorithm to mitigate interference caused by rain or particles in the air.
- a fifth aspect of the invention is an improved system and method for autonomous driving that utilizes dual lidar systems, multi-echo technology and multiple layer scanning with an algorithm to improve robustness and mitigate interference caused by rain or particles in the air.
- FIG. 1A depicts a conventional multi-echo lidar system.
- FIG. 1B depicts a multi-layer scan lidar wherein each scan layer is superimposed on a grid map.
- FIG. 2 depicts a grid obtained by a multi-layer scan lidar. A solid object will be present on multiple layers.
- FIG. 3A is a flowchart of an embodiment wherein a lidar sensor uses last echo filtering and a rain filter algorithm for obstacle feature detection.
- FIG. 3B depicts dual lidar units used to detect obstacles according to an embodiment of the invention.
- FIG. 3C is a flowchart of an embodiment of the invention wherein dual lidar sensors use last echo filtering and a rain filter algorithm to distinguish return signals caused by rain or particles in the air from those of solid objects.
- FIG. 4A depicts an autonomous vehicle with dual lidar sensors separated by a distance “d.”
- FIG. 4B depicts dual lidar sensors and their respective zones of detection.
- FIG. 5A is a plot of a single plane lidar.
- FIG. 5B is a plot of a single plane lidar using the dual lidar system.
- FIG. 5C is a plot of a multi-plane lidar.
- FIG. 5D is a plot of a multi-plane lidar using the dual lidar system.
- While the invention is primarily described for the navigation of autonomous vehicles, it is understood that the invention is not so limited and can be used to assist with other endeavors that use lidar. Other applications include, for example, using the invention in other vehicles such as robots, drones or unmanned aircraft systems. The invention can also be used to improve the robustness of lidar in landscape imaging and mapping applications when particles such as rain or snow are present in the air.
- references in this specification to "one embodiment/aspect” or “an embodiment/aspect” means that a particular feature, structure, or characteristic described in connection with the embodiment/aspect is included in at least one embodiment/aspect of the disclosure.
- the use of the phrase “in one embodiment/aspect” or “in another embodiment/aspect” in various places in the specification are not necessarily all referring to the same embodiment/aspect, nor are separate or alternative embodiments/aspects mutually exclusive of other embodiments/aspects.
- various features are described which may be exhibited by some embodiments/aspects and not by others.
- various requirements are described which may be requirements for some embodiments/aspects but not other embodiments/aspects.
- Embodiment and aspect can in certain instances be used interchangeably.
- multi-echo capability refers to the ability of a lidar unit to gather and evaluate multiple (e.g. three) echoes per transmitted laser pulse. Once an echo reaches the receiver of the unit, the received intensity is transformed into a voltage. An echo from a solid object will usually yield a high voltage over a long period of time. An echo of a rain drop, however, yields a very low voltage over a short period of time.
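The voltage-based distinction described above can be sketched in code. This is a hypothetical illustration, not the patent's implementation; the `Echo` type, the threshold values and the function names are all assumptions chosen for clarity.

```python
# Hypothetical sketch of multi-echo classification: an echo with a high
# received voltage sustained over a long time is treated as a solid
# object, while a brief, weak echo is attributed to a rain drop.
# Threshold values are illustrative assumptions, not from the patent.

from dataclasses import dataclass


@dataclass
class Echo:
    voltage: float       # peak receiver voltage (V)
    duration_ns: float   # pulse width at the receiver (ns)


def classify_echo(echo: Echo,
                  min_voltage: float = 0.5,
                  min_duration_ns: float = 10.0) -> str:
    """Label an echo as 'solid' or 'particle' (e.g. a rain drop)."""
    if echo.voltage >= min_voltage and echo.duration_ns >= min_duration_ns:
        return "solid"
    return "particle"


print(classify_echo(Echo(voltage=1.2, duration_ns=25.0)))  # strong, long echo
print(classify_echo(Echo(voltage=0.1, duration_ns=3.0)))   # weak, short echo
```

In practice the thresholds would be tuned per lidar unit; the point is only that both amplitude and duration feed the decision.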
- multi-layer technology refers to a lidar unit that allows for a pitch angle compensation by means of multiple (e.g. four) scan planes with different vertical angles.
- the photodiode receiver of the unit includes multiple (e.g. four) independent receivers arranged in a line. Each receiver scans a single plane, thus dividing the vertical aperture into multiple planes.
- SLAM refers to Simultaneous Localization And Mapping which enables accurate mapping where GPS localization is unavailable, such as indoor spaces.
- SLAM algorithms use LiDAR and IMU (Inertial Measurement Unit) data to simultaneously locate the sensor and generate a coherent map of its surroundings.
- Time-of-Flight Principle refers to a method for measuring the distance between a sensor and an object, based on the time difference between the emission of a signal and its return to the sensor, after being reflected by an object.
- Lidar is fundamentally a distance measuring technology.
- a lidar system sends light energy to the ground or toward an object. This emitted light can be referred to as a “beam” or “pulse.”
- the lidar unit measures light that is reflected back to the sensor. This reflected light can be referred to as the “echo” or “return.”
- the spatial distance between the lidar system and a measured point is calculated by comparing the delay between the pulse and return.
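The delay-to-distance calculation is the standard time-of-flight relation: the pulse travels out and back, so the one-way range is the speed of light times the delay, divided by two. A minimal sketch:

```python
# Minimal time-of-flight range calculation: the pulse makes a round
# trip, so the one-way distance is c * delay / 2.

C = 299_792_458.0  # speed of light in vacuum, m/s


def tof_distance_m(delay_s: float) -> float:
    """Distance to the reflecting point from the round-trip pulse delay."""
    return C * delay_s / 2.0


# A return delayed by 200 ns corresponds to roughly 30 m.
print(round(tof_distance_m(200e-9), 2))  # 29.98
```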
- Lidar is often a preferred system for use in autonomous vehicles because it can accurately map the three-dimensional surroundings of a vehicle to a high resolution.
- rain or particulate matter in the air can render lidar ineffective as the particles reflect laser light back to the lidar unit.
- an autonomous vehicle will slow or stop as the airborne particulates are indistinguishable from solid objects.
- FIG. 1A depicts a lidar unit 110 using last echo filtering to detect an object 120.
- the technique is also referred to as multi-echo scanning.
- Rain 115 is depicted between the lidar unit 110 and an object 120.
- One pulse can generate multiple echoes (i.e. first echo, intermediate echo, last echo) of laser light to the lidar unit. This can occur when particles in the air such as rain reflect laser light back to the lidar unit 110.
- a first echo 130 is depicted wherein the signal is reflected from particles in the air. Intermediate echoes can also occur as a result of particles in the air.
- a last echo 140 is depicted wherein the signal (i.e. echo) is reflected from a solid object 120.
- the system can analyze echoes to identify those that are a result of rain or particles in the air. Using a “last echo filter,” the first and intermediate echoes are attributed to rain or particles in the air. The last echo is attributed to a solid object.
- this technique has limitations and is ineffective in heavy rain or with substantial particulate matter in the air.
- FIG. 1B depicts multi-layer scanning, another technique aimed at identifying echoes or returns from rain 115 or particulate matter in the air.
- a lidar unit 110 scans multiple angles.
- a layer 130 (plane or stacked plane) can be attributed to the scan at each angle.
- the system can compare data from the layers to confirm the presence or absence of an object. An object that appears on a single layer will likely be attributed to a particle in the air. In a preferred method, the presence of a solid object is confirmed when echoes are detected in four or more adjacent layers.
- Each scan angle/layer can be superimposed on a grid map as depicted in FIG. 2.
- the results of a scan are represented on four layers. Each of the four scanned angles is depicted on a horizontal grid. Each shaded square indicates that the lidar unit detected an echo.
- a conventional lidar system will indicate the presence of two objects, 215 and 225. Analysis of data from multiple layers can distinguish a false positive reading from a true reading.
- the obstacles can be identified as a particle in the air such as rain.
- while the first object 215 is present in multiple layers, it is absent in an intermediate layer (Scan Layer 3). Further, its location varies among layers. In this circumstance, the system can conclude that the echoes are attributable to rain or particles in the air. In contrast, the second object 225 is present in multiple consecutive layers at the same location. With this data, the system can confirm that the signals are attributable to a solid object. In a preferred method, an echo in four cells is used to confirm the presence of an object or obstacle. If the object is in three or fewer cells, it is attributed to rain or airborne particulates.
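The multi-layer grid rule can be sketched as follows. This is an illustrative reading of the rule stated above — a detection counts as solid only if the same grid cell is occupied in at least four scan layers; the grid representation and threshold default are assumptions.

```python
# Hypothetical multi-layer grid check: a cell is confirmed as a solid
# object only if it is occupied in at least `min_layers` scan layers;
# otherwise the echoes are attributed to rain or airborne particles.
# Grid contents below are illustrative.

def confirm_solid(layers: list[set[tuple[int, int]]],
                  cell: tuple[int, int],
                  min_layers: int = 4) -> bool:
    """True if `cell` is occupied in at least `min_layers` layers."""
    hits = sum(1 for layer in layers if cell in layer)
    return hits >= min_layers


# Cell (3, 7): occupied in all four layers -> confirmed solid object.
# Cell (1, 2): occupied in only one layer -> attributed to rain.
layers = [{(3, 7)}, {(3, 7), (1, 2)}, {(3, 7)}, {(3, 7), (5, 5)}]
print(confirm_solid(layers, (3, 7)))  # True
print(confirm_solid(layers, (1, 2)))  # False
```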
- FIG. 3A is a flowchart that depicts a lidar unit that employs both last echo filtering and a rain filter algorithm.
- the lidar unit can use a last echo filter 305 and a rain filter algorithm 315 that analyzes data from multi-layer scanning for obstacle feature and detection 325. This improves the reliability and robustness of lidar when airborne particles such as rain are present.
- FIG. 3B depicts an embodiment of the invention wherein two lidar units function concurrently to improve the robustness of an autonomous vehicle navigation system and allow it to operate during rainy or snow conditions.
- the system can include a first (i.e. left) lidar unit 210 and a second (i.e. right) lidar unit 220 that scan substantially the same area. Data can be collected from each lidar unit and compared using an algorithm to confirm that a signal is the result of a solid object.
- the system confirms the presence of an object using the second lidar unit.
- the system can use an algorithm to analyze data from each lidar unit which includes emitted light, reflected light (i.e. echoes) as well as the angles of reflection and movement of objects.
- the second lidar unit confirms that reflected light received by the first lidar unit is the result of a solid object.
- a reflected pulse (i.e. echo) that is detected by one of the units, without confirmation from the other, can be attributed to a particle in the air and classified as a false positive.
- An echo that is detected by the left lidar unit 210 that is not detected by right lidar unit 220 can be attributed to a particle in the air such as rain.
- An echo that is detected by the right lidar unit 220 that is not detected by left lidar unit 210 can be attributed to a particle in the air such as rain.
- FIG. 3B depicts a first echo 130 along with a last echo 140.
- FIG. 3C is a flowchart that depicts a system of two lidar units according to an embodiment of the invention.
- the first lidar unit 210 and the second lidar unit 220 can use last echo filtering (310, 320). Data from each unit is collected, processed and analyzed using a rain filter algorithm 330.
- the algorithm can, for example, attribute an echo to rain if it is detected by one lidar unit rather than the pair. This leads to more robust obstacle and feature detection 340.
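The cross-confirmation rule — discard an echo seen by only one of the two units — can be sketched as a point-matching filter. This is an illustrative assumption about how the comparison might be implemented: the shared 2D coordinate frame, the matching tolerance and the function name are not specified by the patent.

```python
# Sketch of the dual-lidar rain filter idea: a point (in a shared
# vehicle frame, meters) is kept only if both units report a detection
# near it; echoes seen by one unit alone are discarded as likely rain.
# The 0.2 m matching tolerance is an illustrative assumption.

import math


def cross_confirm(points_left: list[tuple[float, float]],
                  points_right: list[tuple[float, float]],
                  tol_m: float = 0.2) -> list[tuple[float, float]]:
    """Keep left-unit points that the right unit also detected nearby."""
    confirmed = []
    for p in points_left:
        if any(math.dist(p, q) <= tol_m for q in points_right):
            confirmed.append(p)
    return confirmed


left = [(10.0, 2.0), (4.5, -1.0)]    # (4.5, -1.0): echo seen only by left unit
right = [(10.05, 2.1), (7.0, 3.0)]   # (7.0, 3.0): echo seen only by right unit
print(cross_confirm(left, right))    # [(10.0, 2.0)]
```

A brute-force scan is fine for a sketch; a real implementation would likely use a spatial index (e.g. a k-d tree) over the full point clouds.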
- An autonomous vehicle can avoid objects/hazards without slowing and/or stopping due to false positive signals from rain or particles in the air.
- FIG. 4A depicts a preferred arrangement of the system wherein a first lidar unit 210 and a second lidar unit 220 are mounted on the roof of an automobile 410 and separated by a distance “d” from one another.
- the distance “d” can be determined empirically and is preferably between 0.5 meters and 2.5 meters.
- the lidar units can be separated by a distance according to variables such as the vehicle size, resolution of the lidar units, anticipated distance of objects/hazards and the amount of particulate matter in the air.
- the ideal distance can vary from, for example, 1 cm for a small mobile robot to several meters for a large drone.
- the first lidar unit 210 and the second lidar unit 220 can be mounted at other external positions on the vehicle (e.g. the bumper or hood).
- the lidar units are positioned on a horizontal plane with one another so that laser beams are emitted on the same planes.
- FIG. 4B depicts the zones of detection 400 of a first (i.e. left) lidar unit 210 and a second (i.e. right) lidar unit 220.
- Each system scans a substantially circular area to detect an object 120. The scans overlap in the center area 235.
- the number and direction of the rays emitted by each sensor can vary based on the lidar unit and the setting.
- Rain or particles in the air can reflect laser light to the lidar units.
- a solid object 120 is detected from laser light reflected to both lidar units.
- the two lidar units function concurrently and echoes are analyzed. Multi-unit technology can be combined with last echo filtering techniques and multi-layer scanning to improve the robustness of autonomous vehicle navigation and allow operation during conditions where particles such as rain or snow are present in the air.
- Lidar units were affixed to the roof of a vehicle at a distance of approximately one meter from each other.
- Two types of lidar were used: a single plane (LMS151 unit) and a four-plane (LDMRS unit).
- the lidar units used multi-echo technology.
- FIG. 5A - 5D are top view representations of the environment with an autonomous vehicle at the center.
- the horizontal axis represents the distance away from the front and rear of the vehicle in meters.
- the vertical axis represents the distance away from the sides of the vehicle in meters. Each segment depicts a distance of 5 meters.
- FIG. 5A is a plot of a single plane lidar using multi-echo scanning.
- the circled areas are known locations of solid objects (pillars in the rain tunnel).
- the solid objects are detected along with additional points that are false positives from rain drops 505. Most of the false positives are detected in the area in front of the vehicle 505.
- FIG. 5B is a plot of a single plane lidar with data from the dual lidar system. Only scan points detected by both the first lidar system and the second lidar system are included.
- the horizontal axis represents the distance away from the front and rear of the vehicle.
- the vertical axis represents the distance away from the sides of the vehicle.
- Return signals attributable to rain can be identified by comparing FIG. 5A and FIG. 5B. Without the dual lidar system, false positives are present and attributed to echoes from rain drops in the air 505. These false positives are absent with the use of the dual lidar system.
- the circled areas are structural objects (e.g. pillars) that are present and detected in both tests.
- FIG. 5C and FIG. 5D demonstrate the use of a multi-plane lidar filter in detecting false positive echoes.
- FIG. 5C is a plot of a multi-plane lidar (unfiltered). The horizontal axis represents the distance away from the front and rear of the vehicle. The vertical axis represents the distance away from the sides of the vehicle. Each segment depicts a distance of 5 meters. Points along the horizontal axis indicate the detection of false positives due to rain.
- FIG. 5D is a plot of a multi-plane lidar that uses a dual lidar system. Many of the plotted points are absent with the use of a dual lidar filter. These points are attributed to rain drops in the air. The points that are present in both FIG. 5C and FIG. 5D can be attributed to physical objects (i.e. features).
- Additional modifications/components can improve the robustness of the system in heavy rain (e.g. rain of more than 20 mm/hour).
- One approach is to use lidar units capable of analyzing additional layers (e.g. five or more layers) for better penetration in rain.
- Lidar units with higher refresh rates can also be used to filter out rain droplets.
- a mechanical air blower can be implemented to provide an air curtain with a range (e.g. up to one meter) to reduce the number of rain drops detected by the lidar units.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Optical Radar Systems And Details Thereof (AREA)
- Measurement Of Optical Distance (AREA)
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020528321A JP2021514457A (en) | 2017-12-08 | 2017-12-08 | Rain filtering techniques for autonomous vehicles |
PCT/SG2017/050607 WO2019112514A1 (en) | 2017-12-08 | 2017-12-08 | Rain filtering techniques for autonomous vehicle |
AU2017442202A AU2017442202A1 (en) | 2017-12-08 | 2017-12-08 | Rain filtering techniques for autonomous vehicle |
SG11202005246TA SG11202005246TA (en) | 2017-12-08 | 2017-12-08 | Rain filtering techniques for autonomous vehicle |
TW107144162A TW201932868A (en) | 2017-12-08 | 2018-12-07 | Rain filtering techniques for autonomous vehicle |
IL275162A IL275162A (en) | 2017-12-08 | 2020-06-07 | Rain filtering techniques for autonomous vehicles |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/SG2017/050607 WO2019112514A1 (en) | 2017-12-08 | 2017-12-08 | Rain filtering techniques for autonomous vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019112514A1 true WO2019112514A1 (en) | 2019-06-13 |
Family
ID=66751135
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/SG2017/050607 WO2019112514A1 (en) | 2017-12-08 | 2017-12-08 | Rain filtering techniques for autonomous vehicle |
Country Status (6)
Country | Link |
---|---|
JP (1) | JP2021514457A (en) |
AU (1) | AU2017442202A1 (en) |
IL (1) | IL275162A (en) |
SG (1) | SG11202005246TA (en) |
TW (1) | TW201932868A (en) |
WO (1) | WO2019112514A1 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112859010A (en) * | 2019-11-28 | 2021-05-28 | 怡利电子工业股份有限公司 | Alarming method for preventing raindrop misinformation of millimeter wave radar |
TWI732365B (en) * | 2019-11-29 | 2021-07-01 | 怡利電子工業股份有限公司 | Warning method for preventing raindrop false alarm of millimeter wave radar |
CN115825982A (en) * | 2023-02-02 | 2023-03-21 | 深圳煜炜光学科技有限公司 | Method and system for scanning point cloud data of unmanned aerial vehicle in rainy environment |
US11640170B1 (en) | 2019-10-29 | 2023-05-02 | Zoox, Inc. | Identification of particulate matter in sensor data |
US11643072B2 (en) * | 2019-09-27 | 2023-05-09 | Zoox, Inc. | Planning accommodations for particulate matter |
EP4130798A4 (en) * | 2020-04-15 | 2023-05-31 | Huawei Technologies Co., Ltd. | Target identification method and device |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2024052332A (en) * | 2022-09-30 | 2024-04-11 | 株式会社小松製作所 | Work site detection system and work site detection method |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5314037A (en) * | 1993-01-22 | 1994-05-24 | Shaw David C H | Automobile collision avoidance system |
JP2013167479A (en) * | 2012-02-14 | 2013-08-29 | Toyota Motor Corp | Laser radar device and object detection method using the same |
US20150009485A1 (en) * | 2013-07-02 | 2015-01-08 | Electronics And Telecommunications Research Institute | Laser radar system |
CN204882872U (en) * | 2015-09-09 | 2015-12-16 | 厦门理工学院 | To keeping away barrier system before car |
CN105607075A (en) * | 2015-09-08 | 2016-05-25 | 北京铁路局北京科学技术研究所 | Road safety monitoring method and apparatus thereof |
CA2980305A1 (en) * | 2015-03-25 | 2016-09-29 | Waymo Llc | Vehicle with multiple light detection and ranging devices (lidars) |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009069020A (en) * | 2007-09-13 | 2009-04-02 | Toshiba Corp | Radar device |
JP5092076B2 (en) * | 2007-10-26 | 2012-12-05 | オプテックス株式会社 | Laser area sensor |
JP2016191617A (en) * | 2015-03-31 | 2016-11-10 | シャープ株式会社 | Obstacle determination device |
JP6598517B2 (en) * | 2015-06-04 | 2019-10-30 | 東日本高速道路株式会社 | Reverse running vehicle detection system using light waves |
- 2017
  - 2017-12-08 WO PCT/SG2017/050607 patent/WO2019112514A1/en active Application Filing
  - 2017-12-08 SG SG11202005246TA patent/SG11202005246TA/en unknown
  - 2017-12-08 AU AU2017442202A patent/AU2017442202A1/en not_active Abandoned
  - 2017-12-08 JP JP2020528321A patent/JP2021514457A/en not_active Ceased
- 2018
  - 2018-12-07 TW TW107144162A patent/TW201932868A/en unknown
- 2020
  - 2020-06-07 IL IL275162A patent/IL275162A/en unknown
Also Published As
Publication number | Publication date |
---|---|
JP2021514457A (en) | 2021-06-10 |
AU2017442202A1 (en) | 2020-06-25 |
SG11202005246TA (en) | 2020-07-29 |
TW201932868A (en) | 2019-08-16 |
IL275162A (en) | 2020-07-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11609329B2 (en) | Camera-gated lidar system | |
WO2019112514A1 (en) | Rain filtering techniques for autonomous vehicle | |
US11703879B2 (en) | All weather autonomously driven vehicles | |
KR102614323B1 (en) | Create a 3D map of a scene using passive and active measurements | |
EP2721593B1 (en) | System and method for traffic side detection and characterization | |
CN110389586B (en) | System and method for ground and free space exploration | |
US10664974B2 (en) | System and method for object detection using edge characteristics | |
US20150378015A1 (en) | Apparatus and method for self-localization of vehicle | |
US20150336575A1 (en) | Collision avoidance with static targets in narrow spaces | |
Taraba et al. | Utilization of modern sensors in autonomous vehicles | |
KR20120072131A (en) | Context-aware method using data fusion of image sensor and range sensor, and apparatus thereof | |
US20160299229A1 (en) | Method and system for detecting objects | |
CN108333589A (en) | A kind of automatic driving vehicle obstacle detector | |
RU2763800C1 (en) | Onboard sensor system | |
CN110412565A (en) | Sensing system and automatic driving vehicle | |
US20180321377A1 (en) | Method for capturing a surrounding region of a motor vehicle with object classification, control device, driver assistance system and motor vehicle | |
US20180004221A1 (en) | Autonomous guidance system | |
KR101449288B1 (en) | Detection System Using Radar | |
CN207851290U (en) | A kind of automatic driving vehicle obstacle detector | |
KR101428281B1 (en) | System for detecting road-kerb of veichle and method thereof | |
US11914679B2 (en) | Multispectral object-detection with thermal imaging | |
EP4379424A1 (en) | Multipath object identification for navigation | |
US11906623B1 (en) | Velocity estimation using light detection and ranging (LIDAR) system | |
CN211032395U (en) | Autonomous vehicle | |
WO2023105463A1 (en) | A system and method for lidar blockage detection |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17934170; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2020528321; Country of ref document: JP; Kind code of ref document: A |
NENP | Non-entry into the national phase | Ref country code: DE |
ENP | Entry into the national phase | Ref document number: 2017442202; Country of ref document: AU; Date of ref document: 20171208; Kind code of ref document: A |
122 | Ep: pct application non-entry in european phase | Ref document number: 17934170; Country of ref document: EP; Kind code of ref document: A1 |