CN114895323A - Method for classifying precipitation by means of a lidar system - Google Patents


Info

Publication number
CN114895323A
CN114895323A (application CN202210092775.0A)
Authority
CN
China
Prior art keywords
precipitation
lidar system
data point
data points
point cloud
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210092775.0A
Other languages
Chinese (zh)
Inventor
M·维希曼
M·卡米尔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of CN114895323A publication Critical patent/CN114895323A/en
Pending legal-status Critical Current

Classifications

    • G01S17/95 Lidar systems specially adapted for meteorological use
    • G01S17/89 Lidar systems specially adapted for mapping or imaging
    • G01S17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S7/4802 Analysis of echo signal for target characterisation; target signature; target cross-section
    • G01S7/4876 Extracting wanted echo signals by removing unwanted signals (pulse systems)
    • G01S7/493 Extracting wanted echo signals (non-pulse systems)

Abstract

A method for classifying precipitation by means of a lidar system is described, which method comprises the following steps: a) detecting, with a lidar system, a data point cloud comprising a plurality of data points; b) performing at least one filtering action on at least one portion of the data point cloud; c) classifying data points of the filtered portion of the data point cloud as precipitation. Furthermore, a corresponding device, a corresponding computer program and a corresponding machine-readable storage medium are described.

Description

Method for classifying precipitation by means of a lidar system
Technical Field
The invention relates to a method for classifying precipitation by means of a lidar system. A corresponding device, a corresponding computer program and a corresponding machine-readable storage medium are also described.
Background
A lidar system can be based on different measurement methods. Atmospheric weather events such as rain, snow and fog cause major disturbances to the operational capability of a lidar system in use. Such weather events can significantly reduce visibility, for example owing to scattering and absorption of the laser light, and thus make the identification of objects difficult. For the use of lidar systems as safety-relevant environmental sensors in the context of automated driving, it is highly important to detect atmospheric weather events within the field of view of the lidar system in a timely and sufficiently reliable manner. This information can be used to continuously assess system performance during operation, and to identify system degradation in time and report it as an error to the system interface. For example, a reduction of the speed of an autonomously driving vehicle can be triggered if a performance degradation due to an atmospheric weather event is detected.
For identifying system performance degradation, it is advantageous if the lidar sensor is able to evaluate and characterize weather events during driving. It is particularly expedient to unambiguously recognize which disturbances in the field of view are caused purely by weather events, and to classify the different atmospheric weather conditions.
Document EP 3451021 A1 relates to a measuring device with scanning functionality for optical measurement of the surroundings, wherein the device has a sensor with a microcell array as receiving surface and, depending on the emission direction of the emitted radiation, defines a direction-dependent active partial region of the receiver in order to match the active receiver surface to the changing imaging position of the received radiation.
Document WO 2018/127789 A1 relates to a method and a device for determining a roadway surface by means of a vehicle camera system, comprising the following steps: at least one image of the surroundings of the vehicle is captured by means of the vehicle camera system. The at least one image is evaluated in order to determine a state feature characteristic of precipitation and/or of the roadway surface while the vehicle, or another vehicle having the vehicle camera system, drives over the roadway. The determined state features are taken into account when determining the roadway surface. The result of the roadway-surface determination, or a friction-coefficient estimate derived from it, can preferably be output to a driver assistance function or a vehicle control function, or provided as information to the driver.
Disclosure of Invention
A method for classifying precipitation by means of a lidar system is disclosed. The method comprises the following steps:
a) detecting, with the lidar system, a data point cloud comprising a plurality of data points;
b) performing at least one filtering action on at least one portion of the data point cloud;
c) classifying data points of the filtered portion of the data point cloud as precipitation.
In this case, a data point cloud comprising a plurality of data points is detected by means of a lidar system. At least one filtering action is then performed on at least one portion of the data point cloud. The filtering action can be performed on the whole data point cloud. Data points of the filtered portion of the data point cloud are then classified as precipitation.
This method is advantageous because it allows precipitation to be classified and thus an important influencing factor for the operational capability of the lidar system to be determined, one which, for example, reduces visibility and makes the identification of objects difficult. The reliability of the information provided by the lidar system is thereby improved. Furthermore, an estimation of the precipitation rate can be achieved, for example to assess whether the precipitation is more likely to be heavy or light. The lidar system can additionally provide this information, which extends its functionality.
The method may be implemented, for example, in a computer-implemented manner.
Advantageous refinements of the invention are specified by the measures cited in the following.
Expediently, the at least one filtering action comprises spatial high-pass filtering and/or temporal differencing of the data points. Temporal differencing can, for example, mean subtracting the immediately previously detected data point cloud from the currently detected one. This is advantageous because the filtering retains precisely those data points which are characteristic of precipitation. The classification is thus simplified.
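Temporal differencing of successive scans can be sketched as follows (an illustrative voxel-based sketch, not the patent's implementation; the NumPy point representation and the `voxel_size` parameter are assumptions):

```python
import numpy as np

def temporal_difference(current: np.ndarray, previous: np.ndarray,
                        voxel_size: float = 0.2) -> np.ndarray:
    """Keep only points of the current scan whose voxel was empty in the
    previous scan. Static scene content occupies the same voxels in
    consecutive scans and cancels out; fast-moving precipitation echoes
    survive the subtraction. Inputs are (N, 3) arrays of x/y/z returns."""
    prev_voxels = set(map(tuple, np.floor(previous / voxel_size).astype(int)))
    cur_voxels = np.floor(current / voxel_size).astype(int)
    keep = np.array([tuple(v) not in prev_voxels for v in cur_voxels])
    return current[keep]
```

In practice the voxel size would be matched to the sensor's spatial resolution and to the expected particle displacement between scans.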
Expediently, the above steps are repeated, which allows the falling speed of the precipitation to be determined. This is advantageous because the type of precipitation can be inferred from its falling speed.
Expediently, the type of precipitation is determined from the falling speed. This is advantageous because it can be estimated, for example, whether the precipitation is snow, which falls more slowly than rain.
Expediently, the data points classified as precipitation are tracked over multiple executions of the above steps in order to determine the falling speed of the precipitation. This is advantageous because it improves the accuracy of the determined falling speed.
Expediently, the data points classified as precipitation are examined with regard to their random distribution in order to increase the confidence of a correct classification. The probability of a correct classification can thus be increased, or erroneously classified data points can be specifically excluded.
The subject of the invention is furthermore also a device for classifying precipitation, comprising a lidar system arranged for carrying out the steps of the above-mentioned method. This is advantageous in that a device is thus provided which is capable of achieving the advantages described above.
The subject of the invention is additionally a computer program comprising instructions which cause the above-mentioned device to carry out the steps of the above-mentioned method. This is advantageous because the method can thus be implemented simply in software.
The subject matter of the invention is furthermore a machine-readable storage medium on which the above-described computer program is stored.
Drawings
Advantageous embodiments of the invention are shown in the drawings and are further elucidated in the following description.
The figures show:
FIG. 1: a flow chart of a method according to the invention according to a first embodiment;
FIG. 2: a flow chart of a method according to the invention according to a second embodiment;
FIG. 3: a flow chart of a method according to the invention according to a third embodiment;
FIG. 4: a schematic of a plurality of data point clouds generated in a method according to the invention;
FIG. 5: schematic illustration of an apparatus according to the invention according to an embodiment.
Detailed Description
Throughout the drawings, the same reference numerals indicate the same apparatus components or the same method steps.
Fig. 1 shows a flowchart of the method according to the invention for classifying precipitation according to a first embodiment, wherein a data point cloud is detected by means of a lidar system in a first step S11. The data point cloud is composed of a plurality of data points.
In a second step S12, at least one filtering action is performed on at least one portion of the detected data point cloud, for example spatial high-pass filtering and/or temporal differencing, provided that other data point clouds from previous executions are already available.
The data points of the filtered portion are then classified as precipitation in a third step S13.
Fig. 2 shows a flow chart of the inventive method according to a second embodiment. In a first step S21, a data point cloud consisting of 3D data points is detected by the lidar system. The data point cloud contains associated 3D information, including reflections by 3D objects from the environment that are of interest for environmental detection and reflections caused by atmospheric particles.
In a second step S22, high-pass filtering is applied to the data point cloud. Spatial high-pass filters are known from image processing and are a suitable means of distinguishing spatially rapidly changing image content, such as raindrops, from spatially slowly changing content, such as houses.
After the high-pass filtering, two new data point clouds are generated in a third step S23. The first data point cloud describes the spatially slowly changing 3D objects in the field of view of the lidar system, while the second data point cloud describes the spatially separated, randomly distributed precipitation particles in the field of view. Depending on the size of the 3D objects in the field of view, especially small objects, protruding object parts, or object edges may be misclassified as precipitation. An additional spatial low-pass filtering before the high-pass filtering can improve the distinction between such objects and precipitation. This may be performed as an additional step before the third step S23.
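The splitting into these two data point clouds can be illustrated with a crude neighbour-count criterion (a sketch only; `radius` and `min_neighbors` are assumed illustrative thresholds, not values from the patent):

```python
import numpy as np

def split_by_isolation(points: np.ndarray, radius: float = 0.5,
                       min_neighbors: int = 3):
    """Split a point cloud into densely supported points (extended 3D
    objects, the low-pass part) and isolated points (precipitation
    candidates, the high-pass part). Brute-force O(N^2) distances,
    which is fine for a sketch."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    neighbor_counts = (d < radius).sum(axis=1) - 1  # exclude the point itself
    isolated = neighbor_counts < min_neighbors
    return points[~isolated], points[isolated]
```

A production system would use a spatial index (e.g. a k-d tree) instead of the quadratic distance matrix.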
In a fourth step S24, the data points of the second data point cloud are compared with a common model of the 3D object, for example a vehicle or a bicycle. Because of the random distribution of the data points of the second data point cloud, no meaningful comparison can be carried out here, and the data points can therefore be classified as precipitation.
Precipitation typically falls through the point cloud vertically, faster than other objects or particles move. The high speed of light and the scanning speed of the lidar system allow an approximately instantaneous snapshot of the field of view, with the precipitation particles appearing approximately stationary within it. If multiple detected data point clouds are observed in sequence, an apparent downward movement of the precipitation particles therefore emerges from the differences between the data point clouds. The falling speed is then determined in a fifth step S25 from this movement and from the typically known time difference between the sampling steps of the lidar system. The falling speed is an indicator of whether the precipitation is more likely a fast-falling type such as hail or large raindrops, or a slow-falling type such as snow. Additionally, slowly falling precipitation is more likely to follow a horizontal trajectory and can therefore also be classified according to this characteristic.
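The determination of the falling speed from two successive scans can be sketched as follows (greedy nearest-neighbour matching is a simplification of real tracking; `max_match` is an assumed threshold):

```python
import numpy as np

def estimate_fall_speed(precip_t0: np.ndarray, precip_t1: np.ndarray,
                        dt: float, max_match: float = 1.0) -> float:
    """Median vertical drop of matched precipitation points between two
    scans, divided by the scan interval dt. Each point at t0 is matched
    to its nearest point at t1; matches farther apart than max_match are
    discarded. Returns NaN when nothing could be matched."""
    drops = []
    for p in precip_t0:
        dists = np.linalg.norm(precip_t1 - p, axis=1)
        j = int(np.argmin(dists))
        if dists[j] <= max_match:
            drops.append(p[2] - precip_t1[j, 2])  # z decreases while falling
    return float(np.median(drops) / dt) if drops else float("nan")
```

As a rough orientation, snow falls at around 1 m/s, raindrops at roughly 4 to 9 m/s depending on size, and hail faster still, which matches the classification logic described above.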
In a sixth step S26, redundant information about the precipitation rate is determined from the temporal tracking, over a plurality of detected data point clouds, of the spatially separated precipitation particles, i.e. of the data points classified as precipitation in the field of view of the lidar system, from knowledge of the respective observed detection periods, and from the size of the sensed volume in the field of view, which is known from the system design of the lidar system. The sixth step S26 may also be omitted.
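A very rough version of such a precipitation-rate estimate might look as follows (the monodisperse 1 mm drop diameter is purely an illustrative assumption; a real system would use a calibrated drop-size model):

```python
import math

def precipitation_rate_mm_per_h(particles_per_m3: float,
                                fall_speed_m_s: float,
                                mean_drop_diameter_m: float = 1e-3) -> float:
    """Rough flux estimate: number density (particles counted in the sensed
    volume, divided by that volume) times fall speed times the volume of a
    spherical drop, converted to mm of water column per hour."""
    drop_volume_m3 = math.pi / 6.0 * mean_drop_diameter_m ** 3  # sphere
    flux_m_per_s = particles_per_m3 * fall_speed_m_s * drop_volume_m3
    return flux_m_per_s * 1000.0 * 3600.0  # m/s of water -> mm/h
```

The number density itself follows from dividing the count of precipitation-classified points by the sensed volume known from the system design.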
Fig. 3 shows a flow chart of the inventive method according to a third embodiment. In a first step S31, a data point cloud consisting of 3D data points is detected by the lidar system. The data point cloud contains associated 3D information, including reflections by 3D objects from the environment that are of interest for environmental detection and reflections caused by atmospheric particles.
In a second step S32, the data point cloud just detected is mathematically subtracted from a data point cloud detected in a previous method execution, for example in the immediately preceding detection. This is equivalent to temporal differencing.
The time interval between the data point clouds used in the differentiation can be selected, for example, adaptively according to the falling speed of the precipitation.
If this method is applied continuously to each detected data point cloud, for example in the sense of a sliding-window algorithm, the sliding differencing generates in each case a new data point cloud that contains only rapidly changing data points. The data points remaining after subtracting the two data point clouds are only those which have changed between the respective scans. These data points are, to a large extent, reflections of the lidar signal off precipitation particles.
In a third step S33, the data points of the resulting data point cloud are compared with a common model of a 3D object, for example a vehicle or a bicycle. Because of the random distribution of the data points of the resulting data point cloud, no meaningful comparison can be carried out here, and the data points can therefore be classified as precipitation.
Precipitation typically falls through the point cloud vertically, faster than other objects or particles move. The high speed of light and the scanning speed of the lidar system allow an approximately instantaneous snapshot of the field of view, with the precipitation particles appearing approximately stationary within it. If multiple detected data point clouds are observed in sequence, an apparent downward movement of the precipitation particles therefore emerges from the differences between the data point clouds. The falling speed is then determined in a fourth step S34 from this movement and from the typically known time difference between the sampling steps of the lidar system. The falling speed is an indicator of whether the precipitation is more likely a fast-falling type such as hail or large raindrops, or a slow-falling type such as snow. Additionally, slowly falling precipitation is more likely to follow a horizontal trajectory and can therefore also be classified according to this characteristic.
In a fifth step S35, redundant information about the precipitation rate is determined from the temporal tracking, over a plurality of detected data point clouds, of the spatially separated precipitation particles, i.e. of the data points classified as precipitation in the field of view of the lidar system, from knowledge of the respective observed detection periods, and from the size of the sensed volume in the field of view, which is known from the system design of the lidar system. The fifth step S35 may also be omitted.
To improve the confidence of a correct classification as precipitation, measures such as the Euclidean distance or the Manhattan distance can be used to verify that the data points classified as precipitation are randomly distributed and do not form a coherent object.
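Such a randomness check can be sketched with the mean nearest-neighbour distance (an illustrative criterion; the decision threshold would have to be calibrated for a real system):

```python
import numpy as np

def mean_nn_distance(points: np.ndarray) -> float:
    """Mean Euclidean nearest-neighbour distance of the candidate points.
    Randomly scattered precipitation returns yield a large, fairly uniform
    spread, while points belonging to a coherent object cluster tightly
    and score much lower."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)  # ignore self-distances
    return float(d.min(axis=1).mean())
```

Comparing this score against a threshold, or against the value expected for a uniform random scatter in the sensed volume, flags clusters that were misclassified as precipitation.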
Fig. 4 shows a schematic representation of a plurality of data point clouds produced in a method according to the invention. The first data point cloud 41 represents a data point cloud detected by the lidar system, which, as schematically shown here, comprises for example cars, persons and houses as 3D objects, as well as precipitation particles 411.
The second data point cloud 42, which now shows only extended objects, here cars, people and houses, results from the corresponding low-pass filtering of the first data point cloud 41. Precipitation particles 411 have been removed by low pass filtering.
The third data point cloud 43, which now also contains only precipitation particles 411, results from a corresponding high-pass filtering of the first data point cloud 41. The corresponding data point can therefore be classified as precipitation.
Fig. 5 shows a schematic view of the inventive device 50 for classifying precipitation according to an embodiment. The device 50 comprises at least one lidar system 51, which is provided for carrying out the method according to the invention.

Claims (9)

1. A method for classifying precipitation by means of a lidar system (51), the method comprising the steps of:
a) detecting a data point cloud (41) comprising a plurality of data points by means of the lidar system (51);
b) performing at least one filtering action on at least one portion of the data point cloud (41);
c) classifying data points of the filtered portion of the data point cloud (41) as precipitation.
2. The method according to claim 1, wherein the at least one filtering action in step b) comprises spatial high-pass filtering and/or temporal differencing of the data points.
3. The method according to any of the preceding claims, wherein the falling speed of the precipitation is determined by repeating steps a), b) and c).
4. The method according to the preceding claim, wherein the type of precipitation is determined from the falling speed.
5. The method according to any one of the preceding claims, wherein the data points classified as precipitation are tracked over multiple executions of steps a), b) and c) in order to determine the falling speed of the precipitation.
6. The method according to any one of the preceding claims, wherein the random distribution of the data points classified as precipitation is examined in order to increase the confidence of a correct classification of those data points.
7. An apparatus (50) for classifying precipitation, the apparatus comprising a lidar system (51), the lidar system (51) being arranged for carrying out the steps of the method according to any of claims 1 to 6.
8. A computer program comprising instructions for causing an apparatus (50) as defined in claim 7 to carry out all the steps of the method as defined in any one of claims 1 to 6.
9. A machine-readable storage medium on which a computer program according to claim 8 is stored.
CN202210092775.0A 2021-01-26 2022-01-26 Method for classifying precipitation by means of a lidar system Pending CN114895323A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102021200652.3A DE102021200652A1 (en) 2021-01-26 2021-01-26 Procedure for classifying precipitation using a LiDAR system
DE102021200652.3 2021-01-26

Publications (1)

Publication Number Publication Date
CN114895323A (en) 2022-08-12

Family

ID=82320837

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210092775.0A Pending CN114895323A (en) 2021-01-26 2022-01-26 Method for classifying precipitation by means of a lidar system

Country Status (2)

Country Link
CN (1) CN114895323A (en)
DE (1) DE102021200652A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10427645B2 (en) 2016-10-06 2019-10-01 Ford Global Technologies, Llc Multi-sensor precipitation-classification apparatus and method
CN117310741A (en) 2017-01-03 2023-12-29 应诺维思科技有限公司 Lidar system and method for detecting and classifying objects
EP3451021A1 (en) 2017-08-30 2019-03-06 Hexagon Technology Center GmbH Measuring device with scan functionality and adjustable receiving areas of the receiver
DE102019116100A1 (en) 2019-06-13 2020-12-17 Valeo Schalter Und Sensoren Gmbh Method for differentiating between precipitation and spray based on a point cloud

Also Published As

Publication number Publication date
DE102021200652A1 (en) 2022-07-28


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination