US20230122788A1 - Method and device for the recognition of blooming in a lidar measurement - Google Patents

Method and device for the recognition of blooming in a lidar measurement Download PDF

Info

Publication number
US20230122788A1
Authority
US
United States
Prior art keywords
lidar
measurement
blooming
distance value
passive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US17/920,071
Other versions
US11619725B1 (en)
Inventor
Martin Meinke
David Peter
Sebastian Buck
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mercedes Benz Group AG
Original Assignee
Mercedes Benz Group AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mercedes Benz Group AG
Assigned to Mercedes-Benz Group AG. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MEINKE, MARTIN; PETER, DAVID; BUCK, SEBASTIAN
Application granted
Publication of US11619725B1
Publication of US20230122788A1
Legal status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48 Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00 of systems according to group G01S 17/00
    • G01S 7/497 Means for monitoring or calibrating
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 Systems determining position data of a target
    • G01S 17/46 Indirect determination of position data
    • G01S 17/48 Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S 17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Abstract

Blooming in a lidar measurement is recognized using a distance from a lidar reflection point determined in an active measurement and a passive measurement. A first distance value is determined in the active measurement, based on a signal duration of a laser pulse, and a second distance value is determined in the passive measurement, based on a triangulation of two-dimensional intensity measurements carried out from different measuring positions. Blooming is identified when the second distance value exceeds the first distance value by a pre-determined amount.

Description

    BACKGROUND AND SUMMARY OF THE INVENTION
  • Exemplary embodiments of the invention relate to a method for the recognition of blooming in a lidar measurement, as well as to a device for the recognition of blooming in a lidar measurement having at least one lidar.
  • DE 10 2005 003 970 A1 discloses a method for determining functionality of a sensor arrangement on a motor vehicle, wherein a region captured by the sensor arrangement is divided into different sub-regions, and sensor signals allocated to one sub-region, from one particular surrounding area, are analyzed for determining the functionality of the sensor arrangement. Here, sensor signals, which are captured one after the other for different sub-regions when passing the particular surrounding area, are analyzed. The sub-regions are capture regions of different lidar sensors, or different angular sectors of a lidar sensor.
  • Furthermore, a method for the operation of an assistance system of a vehicle is known from DE 10 2018 003 593 A1, wherein the vehicle is moved in the autonomous driving mode by means of the assistance system, and the assistance system comprises an environment sensor, having a number of capture units arranged in and/or on the vehicle. In the autonomous driving mode of the vehicle, an environment of the vehicle, and objects located within it, are captured by means of the capture units, wherein a function of the individual capture units is constantly monitored by means of a monitoring module, and in the event of a failure of one of the capture units, exclusively an assistance function associated with this failed capture unit is deactivated by means of a planning module connected to the monitoring module. The capture units comprise a lidar-based sensor.
  • US 2019/0391270 A1 describes a reflection system for the improvement of an environmental observation by the use of a lidar in the presence of highly reflective surfaces. The reflection system comprises several processors and a memory that communicates with the processors. The reflection system further comprises a scanner module, having commands which, when they are executed by the processors, cause the processors, in reaction to determining that a first point cloud contains an observation of a concealed object which reflects strongly, to control an emission of a scanning light beam with a scanning intensity that is different from an initial intensity of an initial light beam, which is used for the capture of the first point cloud, and to dynamically control the lidar to capture a second point cloud that omits the concealed object. Furthermore, there is provision for an output module with commands which, when executed by the processors, cause the processors to generate a combined point cloud from the first point cloud and the second point cloud, which improves the observation of the environment when the lidar is used by reducing disturbances caused by the concealed object.
  • Exemplary embodiments of the present invention are directed to a new method and a new device for the recognition of blooming in a lidar measurement.
  • In the method for the recognition of blooming in a lidar measurement, according to the invention, a distance to a lidar reflection point is determined in an active measurement and a passive measurement, wherein a first distance value is determined in the active measurement, based on a signal duration of a laser pulse, and a second distance value is determined in the passive measurement, based on a triangulation of two-dimensional intensity measurements carried out from different measuring positions. Blooming is then identified when the second distance value exceeds the first distance value by a pre-determined amount.
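  • As an illustration of this decision rule, the following minimal sketch compares the two distance values; the metre values and the threshold are hypothetical and are not specified in the text:

```python
def blooming_detected(d_active_m: float, d_passive_m: float, threshold_m: float) -> bool:
    """Flag blooming when the passively triangulated distance exceeds the
    actively measured (time-of-flight) distance by more than a given margin."""
    return (d_passive_m - d_active_m) > threshold_m


# Hypothetical example: the time-of-flight return reports 12 m, but passive
# triangulation of the same pixel places the surface at 60 m.
print(blooming_detected(d_active_m=12.0, d_passive_m=60.0, threshold_m=5.0))  # True
```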
  • Here, a passive measurement by means of two-dimensional intensity measurements is understood as a capture of an environment by means of at least one lidar, in which the at least one lidar captures exclusively light radiation present in the environment, without the active emission of laser radiation.
  • Here, blooming is presently understood as an overexposure or crosstalk in a lidar measurement. Blooming occurs, for example, when a laser pulse emitted from a lidar is reflected from a strongly reflective target, for example from a road sign or a headlight reflector. In this case, a large amount of the emitted energy is sent back to the lidar, in comparison with less reflective targets. The light beam sent back is normally not optimally focused. The reasons for this are manifold: the reflection from the target is often not ideal, particles in the atmosphere deflect the laser beam, or soiling on a cover of the lidar causes light scattering. This can cause the returned light to hit several receiver cells of the lidar that are located spatially close to one another, or to spill over into neighboring pixels. The result is that, independent of the sensitivity of the detector, a distance measurement is triggered. Blooming effects are also normally stronger at shorter distances from the lidar, because the amount of energy reflected by a target diminishes quickly with the increasing distance the light must cover.
  • Lidars play an important role in driver assistance systems, and other automatedly operated platforms, for example robots, because they enable an exact three-dimensional display of an environment of the lidar. When blooming occurs, however, it can lead to incorrect results in a measurement of distances between the lidar and objects captured in its environment. In particular, false-positive lidar measurements can occur as a result of blooming effects, such that the exact three-dimensional display of the environment is made more difficult.
  • By means of the method, a reliable identification of blooming in lidar measurements is easily possible, such that incorrect results in such distance measurements can be avoided, or at least certainly recognized. A safer operation of applications results from this, for example of automated, especially highly-automated or autonomous driving or moved vehicles and robots.
  • In a possible embodiment of the method, the passive measurement is based on two two-dimensional intensity measurements, wherein the first intensity measurement is carried out by means of a first lidar, and a second intensity measurement is carried out by means of a second lidar, arranged in a different position from the first lidar. This enables a simple and reliable execution of the passive measurement and, consequently, an especially reliable recognition of blooming.
  • In a further possible embodiment of the method, the two two-dimensional intensity measurements are carried out simultaneously, or chronologically one after the other. In particular when intensity measurements are carried out simultaneously, the passive measurement of the distance can be carried out very quickly.
  • In a further possible embodiment of the method, the passive measurement is based on two two-dimensional intensity measurements, wherein a first intensity measurement is carried out by means of a lidar located in a first position, and a second intensity measurement is carried out by means of the same lidar chronologically after the first measurement, and in a second position different from the first position. This enables a simple and reliable execution of the passive measurement and, consequently, an especially reliable recognition of blooming, wherein only one lidar is necessary for the execution of the two-dimensional intensity measurements, which leads to particularly low employment of hardware and costs.
  • In a further possible embodiment of the method, the passive measurement is executed by the evaluation of two-dimensional intensity images captured in the two-dimensional intensity measurements by means of a stereoscopic method. Such stereoscopic methods reliably determine the distance to a lidar reflection point, and thus to an object in the environment of the lidar.
  • In a further possible embodiment of the method, a semi-global matching algorithm is used as a stereoscopic method, by means of which the determination of the distance to a pixel in the two-dimensional intensity images, and thus to an object in the environment of the lidar, can be executed very reliably and especially precisely.
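  • A sketch of this stereoscopic step using OpenCV's semi-global block matcher is shown below; the parameter values are illustrative, and it is assumed that the two passive intensity images have been normalized to 8 bit and behave approximately like a rectified stereo pair along the horizontal angle axis:

```python
import cv2
import numpy as np


def passive_disparity(intensity_1: np.ndarray, intensity_2: np.ndarray) -> np.ndarray:
    """Disparity (in pixels along the horizontal angle axis) between two passive intensity images."""
    img1 = cv2.normalize(intensity_1, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    img2 = cv2.normalize(intensity_2, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    sgbm = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=64,  # search range, must be a multiple of 16
        blockSize=5,
    )
    # compute() returns fixed-point disparities scaled by 16
    return sgbm.compute(img1, img2).astype(np.float32) / 16.0
```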
  • The device for the recognition of blooming in a lidar measurement comprises at least one lidar and, according to the invention, is characterized by a processing unit that is suitable for determining a distance of the at least one lidar to a lidar reflection point in an active measurement and a passive measurement, for determining a first distance value in the active measurement based on a signal duration of a laser pulse, for determining a second distance value in the passive measurement based on a triangulation of two-dimensional intensity measurements carried out from different measuring positions, and for identifying blooming when the second distance value exceeds the first distance value by a pre-determined amount.
  • By use of the device, a reliable recognition of blooming in lidar measurements is easily possible, such that incorrect results in distance measurements executed by means of a lidar can be avoided, or at least certainly recognized. From this results a safer operation of applications, for example of automated, especially highly-automated or autonomous driving or moved vehicles and robots.
  • Exemplary embodiments of the invention are illustrated in more detail in the following with the assistance of drawings.
  • BRIEF DESCRIPTION OF THE DRAWING FIGURES
  • Shown are:
  • FIG. 1 schematically, an arrangement of a lidar, and an environment monitored by the lidar,
  • FIG. 2 schematically, an arrangement of a lidar at different points in time, and an environment monitored by the lidar,
  • FIG. 3 schematically, a lidar image captured by means of a lidar according to FIG. 2 at a first point in time,
  • FIG. 4 schematically, a lidar image captured by means of a lidar according to FIG. 2 at a second point in time,
  • FIG. 5 schematically, an arrangement of two lidars and an environment monitored by the lidars,
  • FIG. 6 schematically, a lidar image captured by means of a first lidar according to FIG. 5 and
  • FIG. 7 schematically, a lidar image captured by means of a second lidar according to FIG. 5 .
  • Parts corresponding to one another are labelled in all figures with the same reference numerals.
  • DETAILED DESCRIPTION
  • In FIG. 1 , an arrangement of a lidar 1, and an environment monitored by the lidar 1 is displayed.
  • Within the environment of the lidar 1 are located two objects O1, O2, which are captured by the lidar 1 within a capture region E.
  • The lidar 1 is, for example, arranged on an automated, especially highly automated or autonomously driving vehicle. The lidar 1 can alternatively be arranged on a robot.
  • The first object O1 is a highly reflective object O1, for example a road sign, for example a motorway sign arranged above a roadway FB. The second object O2 is located on the roadway, and has a certain level of reflectivity, for example also a lower or higher reflectivity than the first object O1.
  • By means of the lidar 1, distances to objects O1, O2 in its environment are determined by the emission of laser pulses and the measuring of a time until a reflected laser pulse hits a receiver of the lidar 1. Here, the lidar 1 can comprise several lasers and/or several receivers in order to increase a measuring rate and a spatial resolution of the lidar 1. Here, a measurement executed by lidar 1, also referred to as a scan, can be carried out in such a way that a complete scan can be interpreted as a two-dimensional measuring grid, also referred to as a lidar image.
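  • One possible representation of such a lidar image is sketched below as a dense grid indexed by vertical and horizontal angle bins, carrying the active range and the passive intensity per cell; the angular resolution is an assumed value:

```python
import numpy as np

N_VERTICAL, N_HORIZONTAL = 64, 1024  # assumed number of vertical and horizontal angle bins
lidar_image = np.zeros(
    (N_VERTICAL, N_HORIZONTAL),
    dtype=[("range_m", np.float32),      # active time-of-flight distance per cell
           ("intensity", np.float32)],   # passive ambient-light intensity per cell
)
```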
  • In the displayed environment of the lidar 1, during the laser measurement, the first object O1, because of its high reflectivity, generates blooming points P1 to Pn above and below the object O1 at an equal distance, such that a so-called blooming artifact is created. If these blooming points P1 to Pn are not detected as such, the danger arises that, in a further computation, for example a sensor fusion, it is assumed that an obstacle, for example an end of a traffic jam, is located there, such that unwanted braking may be triggered by a driving assistance system in some cases.
  • FIG. 2 shows an arrangement of a lidar 1 at different points in time t1, t2, and an environment monitored by the lidar. In FIG. 3 , a lidar image B1 captured by means of the lidar 1 according to FIG. 2 at a point in time t1 is displayed, and in FIG. 4 , a lidar image B2, captured by means of the lidar 1 according to FIG. 2 at a second point in time t2 following the first point in time t1, is displayed. The lidar images B1, B2 here respectively display a two-dimensional measuring grid or a two-dimensional intensity image, whose axes show values of a vertical angle α and values of a horizontal angle β, such that the vertical angle α and the horizontal angle β form image coordinates.
  • The lidar 1 is arranged on a self-moving platform, for example an automated, especially a highly automated or autonomously driving or moved vehicle or robot.
  • As described above, distances from objects O1, O2 are determined in the environment of the lidar by means of the lidar 1, by the emission of laser pulses and the measuring of a time until a reflected laser pulse hits a receiver of the lidar 1. Here, the reflection is generated at a lidar reflection point R, which belongs to the respective object O1, O2, for example to a so-called landmark.
  • Lidars 1 are normally considered active sensors because, according to the description above, they need to actively emit energy to carry out a duration measurement, also known as a time-of-flight measurement. If the receiver of the lidar 1 is sensitive enough, it can also be used to measure an intensity of environmental light at the given wavelength of the lidar 1, which is backscattered to the lidar 1 without active illumination. In this way, by means of the lidar 1, it is possible to generate a highly dynamic greyscale image of a scene in a passive two-dimensional intensity measurement. Because of the markedly lower intensity of passively reflected light, no blooming effects occur in passive measurements of this kind. Such passive measurements can here be executed immediately before or immediately after the active measurement, such that a recorded scene shows almost no change between the two measurements. While the active measurement delivers an exact three-dimensional display of the environment of the lidar 1, the passive measurement enables a higher degree of detail for a two-dimensional aspect of an object O1, O2. In this way, the two measuring principles complement each other.
  • The lidar 1, both displayed and arranged on the self-moving platform, is suitable both for carrying out an active measurement of distances from the lidar reflection point R, and for carrying out a passive measurement of intensities. Here, the passive measurement can be carried out either immediately before or immediately after the active measurement.
  • For determining blooming in a lidar measurement, a distance from the lidar reflection point R is determined in an active measurement and a passive measurement with the assistance of the data captured by the lidar 1, by determining a first distance value in the active measurement, based on a signal duration of a laser pulse from the lidar 1 to the lidar reflection point R and back to the lidar 1.
  • Subsequently, a second distance value is determined in the passive measurement, based on a triangulation of two-dimensional intensity measurements carried out from different measuring positions.
  • Blooming is then identified if the second distance value exceeds the first distance value by a pre-determined amount, especially if it is significantly larger than the first distance value.
  • Here, the passive measurement is based on two two-dimensional intensity measurements, wherein a first intensity measurement is carried out by means of the lidar 1 at the first point in time t1, located in a first position, and a second intensity measurement is carried out by means of the same lidar 1 at the second point in time t2, a point in time after the first measurement, and in a second position different from the first position. Between the two points in time t1, t2, the relative position of the lidar 1 to the lidar reflection point R changes as a result of the movement of the platform.
  • Here, a movement of the lidar 1 between two measurements is, for example, known by evaluating an inertial measuring unit, which is likewise arrayed on the moveable platform, and is calibrated to the lidar 1 or to a shared frame of reference.
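  • A sketch of how the lidar motion between the two measurements could be derived from the inertial measuring unit is given below; the 4x4 homogeneous transforms (the IMU pose change between t1 and t2 and the IMU-to-lidar calibration) are assumed inputs and not specified in the text:

```python
import numpy as np


def lidar_motion(T_imu_t1_t2: np.ndarray, T_lidar_imu: np.ndarray) -> np.ndarray:
    """Relative pose of the lidar between t1 and t2 (4x4 homogeneous transform).

    T_imu_t1_t2  maps coordinates in the IMU frame at t2 into the IMU frame at t1,
    T_lidar_imu  is the fixed extrinsic calibration from the IMU frame to the lidar frame.
    """
    return T_lidar_imu @ T_imu_t1_t2 @ np.linalg.inv(T_lidar_imu)
```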
  • An observation of characteristic positions, for example of landmarks, from different perspectives in the lidar images B1, B2 enables a three-dimensional reconstruction of an observed scene. Because of the movement of the lidar 1, characteristic positions, and so an appurtenant lidar reflection point R, or a pixel recorded in a two-dimensional intensity image displaying it, can appear in different positions in the lidar images B1, B2, which were captured from different positions in the surrounding environment. This effect is generally described as a movement parallax. If the movement of the lidar 1 between the two points in time t1, t2 is known, and a position of one and the same lidar reflection point R is found in both lidar images B1, B2, then a three-dimensional position, and therefore a distance to the lidar reflection point R, can be reconstructed through simple triangulation.
  • For example, the passive measurement is carried out by evaluating the two two-dimensional intensity measurements by means of a stereoscopic method, for example of a semi-global matching algorithm.
  • A possible exemplary embodiment of a method for the recognition of blooming in a lidar measurement is described in the following.
  • Initially, a generally known stereo matching algorithm, for example a semi-global matching algorithm, is used to determine an angle displacement between each pixel in the passive lidar images B1, B2 that are captured from two different perspectives. For example, at the point in time t1, the lidar 1 sees the reflection point R, or a pixel displaying it, at a vertical angle α of 10 degrees and a horizontal angle β of 5 degrees. At the point in time t2, the lidar 1 sees the reflection point R, or the pixel displaying it, at a vertical angle α of 10 degrees and a horizontal angle β of 20 degrees.
  • As a three-dimensional movement of the lidar 1 between the capture of both lidar images B1, B2 is known, information about the corresponding position angles from the first and second measurement can be used to triangulate three-dimensional coordinates of the measured pixel position, namely of the lidar reflection point R.
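  • The triangulation itself can be sketched as follows; the angle-to-ray convention (x forward, y left, z up), the helper names, and the assumed 5 m forward motion of the lidar between t1 and t2 are illustrative and not taken from the text:

```python
import numpy as np


def direction_from_angles(alpha_deg: float, beta_deg: float) -> np.ndarray:
    """Unit viewing ray for a vertical angle alpha and a horizontal angle beta
    (assumed convention: x forward, y left, z up)."""
    a, b = np.deg2rad(alpha_deg), np.deg2rad(beta_deg)
    return np.array([np.cos(a) * np.cos(b), np.cos(a) * np.sin(b), np.sin(a)])


def triangulate(p1: np.ndarray, d1: np.ndarray, p2: np.ndarray, d2: np.ndarray) -> np.ndarray:
    """Least-squares (midpoint) intersection of the rays p1 + s*d1 and p2 + t*d2."""
    A = np.column_stack([d1, -d2])
    s, t = np.linalg.lstsq(A, p2 - p1, rcond=None)[0]
    return 0.5 * ((p1 + s * d1) + (p2 + t * d2))


# Hypothetical usage with the angles named in the text and an assumed 5 m
# forward motion of the lidar between t1 and t2:
p_t1, p_t2 = np.zeros(3), np.array([5.0, 0.0, 0.0])
point = triangulate(p_t1, direction_from_angles(10.0, 5.0),
                    p_t2, direction_from_angles(10.0, 20.0))
passive_distance_m = np.linalg.norm(point - p_t1)  # second distance value
```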
  • A comparison of the active measurement at a random pixel site with the passive measurement, which has been derived from the triangulation described, now enables conclusions to be drawn about the presence of blooming. If the passive measurement derived with the structure-from-motion algorithm yields a significantly larger distance than the active measurement, then blooming can be identified as a plausible explanation for this.
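  • Applied over a whole lidar image, this comparison could look like the following sketch; the array contents and the 5 m margin are assumed for illustration only:

```python
import numpy as np


def blooming_mask(active_range_m: np.ndarray, passive_range_m: np.ndarray,
                  margin_m: float = 5.0) -> np.ndarray:
    """Boolean mask of pixels whose passively triangulated range exceeds the
    active time-of-flight range by more than the margin (blooming candidates)."""
    return (passive_range_m - active_range_m) > margin_m
```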
  • FIG. 5 shows an arrangement of two lidars 1, 2 and an environment monitored by the lidars 1, 2. In FIG. 6 , a lidar image B1 captured by means of the lidar 1 according to FIG. 5 is displayed, and in FIG. 7 , a lidar image B2, captured at the same time by means of the other lidar 2 according to FIG. 5 , is displayed. Here, the lidar images B1, B2 respectively represent a two-dimensional measuring grid or a two-dimensional intensity image, whose axes show values of a vertical angle α and a horizontal angle β, such that the vertical angle α and the horizontal angle β form image coordinates.
  • Both lidars 1, 2 are arranged on a self-moving platform, for example on an automated, especially a highly automated or an autonomously driving or moved vehicle or robot. The lidars 1, 2 are also chronologically synchronized, such that these are designed to capture spatial angles of the same kind at the same time.
  • Both lidars 1, 2 are suitable both for executing an active measurement of distances from the lidar reflection point R, and for executing a passive measurement of intensities. Here, the passive measurement can be carried out immediately before or immediately after the active measurement.
  • For determining blooming in a lidar measurement, in this exemplary embodiment, a distance to the lidar reflection point R is determined in an active and a passive measurement, with the assistance of data recorded by means of the lidars 1, 2, by determining a first distance value in the active measurement, based on a signal duration of a laser pulse from the lidar 1 and/or from the lidar 2 to the lidar reflection point R, and back to the lidar 1 and/or the lidar 2.
  • Subsequently, a second distance value is determined in the passive measurement based on a triangulation of two-dimensional intensity measurements carried out from different measuring positions.
  • Blooming is then identified when the second distance value exceeds the first distance value by a pre-determined amount, especially when it is significantly greater than the first distance value.
  • Extrinsic parameters of the lidars 1, 2, i.e., their positions and/or alignment, are known. For this purpose, the lidars 1, 2 are calibrated in relation to one another or to a shared frame of reference.
  • In contrast with the exemplary embodiment described in relation to FIGS. 2 to 4 , this allows the passive measurement to be based on two two-dimensional intensity measurements, wherein a first intensity measurement is carried out by means of the first lidar 1, and the second intensity measurement is carried out by means of the second lidar 2, arranged in a different position from that of the first lidar 1.
  • The simultaneous recording of the scene from different perspectives by means of the lidars 1, 2 enables an observation of characteristic positions, for example of landmarks, in the lidar images B1, B2 from the different perspectives, and therefore a three-dimensional reconstruction of the observed scene. As a result of the different positions of the lidars 1, 2, characteristic positions, and therefore an appurtenant lidar reflection point R, or a pixel displaying it, can appear in different positions in the lidar images B1, B2, which have been captured from different environmental positions. Since the relative position of the lidars 1, 2 to one another and their extrinsic parameters are known, a three-dimensional position, and thus a distance to the lidar reflection point R, can be reconstructed by simple triangulation if the position of one and the same lidar reflection point R, or of the pixel displaying it, is found in both lidar images B1, B2.
  • For example, the passive measurement is carried out by evaluation of the two intensity measurements by means of a stereoscopic method, for example a semi-global matching algorithm.
  • A possible exemplary embodiment of a method for the recognition of blooming in a lidar measurement is described in the following.
  • Firstly, a generally known stereo matching algorithm, for example a semi-global matching algorithm, is used to determine an angle displacement between each pixel in the passive lidar images B1, B2 that are recorded from two different perspectives. For example, the lidar 1 sees the lidar reflection point R, or a pixel displaying it, at a vertical angle α of 10 degrees and a horizontal angle β of 5 degrees. The lidar 2 sees the lidar reflection point R, or the pixel displaying it, at the same time, for example at a vertical angle α of 10 degrees and a horizontal angle β of 20 degrees.
  • As a transformation between coordinate systems of both lidars 1 and 2 is known, information about the corresponding position angles is used with the passive intensity measurements executed by means of the lidars 1, 2 to triangulate three-dimensional coordinates of the measured pixel position, i.e., of lidar reflection point R.
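  • A sketch of this two-lidar triangulation is given below; it reuses the hypothetical direction_from_angles and triangulate helpers from the structure-from-motion sketch above, and T_1_2 stands for the assumed, calibrated 4x4 transform taking points from the frame of lidar 2 into the frame of lidar 1:

```python
import numpy as np

# Assumes direction_from_angles() and triangulate() from the earlier sketch are in scope.


def passive_distance_two_lidars(alpha1_deg, beta1_deg, alpha2_deg, beta2_deg,
                                T_1_2: np.ndarray) -> float:
    """Passive distance of the reflection point from lidar 1, triangulated from
    simultaneous angle observations of the two calibrated lidars."""
    p1 = np.zeros(3)                               # lidar 1 defines the reference frame
    d1 = direction_from_angles(alpha1_deg, beta1_deg)
    p2 = T_1_2[:3, 3]                              # origin of lidar 2 in lidar 1's frame
    d2 = T_1_2[:3, :3] @ direction_from_angles(alpha2_deg, beta2_deg)
    point = triangulate(p1, d1, p2, d2)
    return float(np.linalg.norm(point - p1))
```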
  • A comparison of the active measurement at a random pixel site with the passive measurement, which has been derived from the triangulation described, now enables conclusions to be drawn about the presence of blooming. If the passive measurement yields a significantly larger distance than the active measurement, then blooming can be identified as a plausible explanation for this.
  • Although the invention has been illustrated and described in detail by way of preferred embodiments, the invention is not limited by the examples disclosed, and other variations can be derived from these by the person skilled in the art without leaving the scope of the invention. It is therefore clear that there is a plurality of possible variations. It is also clear that embodiments stated by way of example are only really examples that are not to be seen as limiting the scope, application possibilities or configuration of the invention in any way. In fact, the preceding description and the description of the figures enable the person skilled in the art to implement the exemplary embodiments in concrete manner, wherein, with the knowledge of the disclosed inventive concept, the person skilled in the art is able to undertake various changes, for example, with regard to the functioning or arrangement of individual elements stated in an exemplary embodiment without leaving the scope of the invention, which is defined by the claims and their legal equivalents, such as further explanations in the description.
  • LIST OF REFERENCE NUMERALS
    • 1 Lidar
    • 2 Lidar
    • B1 Lidar image
    • B2 Lidar image
    • E Capture region
    • FB Roadway
    • O1 Object 1
    • O2 Object 2
    • P1 to Pn Blooming point
    • R Lidar reflection point
    • t1 Point in time
    • t2 Point in time
    • α Vertical angle
    • β Horizontal angle

Claims (8)

1-7. (canceled)
8. A method for recognizing blooming in a lidar measurement, the method comprising:
determining a distance to a lidar reflection point using an active measurement and a passive measurement, wherein the determined distance includes a first distance value determined in the active measurement based on a signal duration of a laser pulse, and wherein the determined distance includes a second distance value determined in the passive measurement based on a triangulation of two-dimensional intensity measurements performed from different measuring positions; and
identifying blooming in the lidar measurement when the second distance value exceeds the first distance value by a predetermined amount.
9. The method of claim 8, wherein
the passive measurement is based on first and second two-dimensional intensity measurements,
the first two-dimensional intensity measurement is performed by a first lidar; and
the second two-dimensional intensity measurement is performed by a second lidar arranged in a different position from the first lidar.
10. The method of claim 9, wherein the first and second passive two-dimensional intensity measurements are performed simultaneously or chronologically one after the other.
11. The method of claim 8, wherein
the passive measurement is based on first and second two-dimensional intensity measurements,
the first two-dimensional intensity measurement is performed by a lidar located in a first position, and
the second two-dimensional intensity measurement is performed by the same lidar at a point in time after the first measurement and in a second position that is different from the first position.
12. The method of claim 9, wherein the passive measurement is performed using a stereoscopic method to evaluate two-dimensional intensity images captured by the first and second two-dimensional intensity measurements.
13. The method of claim 12, wherein the stereoscopic method involves a semi-global matching algorithm.
14. A device for recognizing blooming in a lidar measurement, the device comprising:
at least one lidar; and
a processing unit configured to
determine a distance to a lidar reflection point using an active measurement and a passive measurement, wherein the determined distance includes a first distance value determined in the active measurement based on a signal duration of a laser pulse, and wherein the determined distance includes a second distance value determined in the passive measurement based on a triangulation of two-dimensional intensity measurements performed from different measuring positions; and
identify blooming in the lidar measurement when the second distance value exceeds the first distance value by a predetermined amount.
US17/920,071 2020-04-21 2021-03-31 Method and device for the recognition of blooming in a lidar measurement Active US11619725B1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE102020110809.5 2020-04-21
DE102020110809.5A DE102020110809B3 (en) 2020-04-21 2020-04-21 Method and device for recognizing blooming in a lidar measurement
PCT/EP2021/058407 WO2021213788A1 (en) 2020-04-21 2021-03-31 Method and device for identifying blooming in a lidar measurement

Publications (2)

Publication Number Publication Date
US11619725B1 (en) 2023-04-04
US20230122788A1 (en) 2023-04-20

Family

ID=75396750

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/920,071 Active US11619725B1 (en) 2020-04-21 2021-03-31 Method and device for the recognition of blooming in a lidar measurement

Country Status (7)

Country Link
US (1) US11619725B1 (en)
EP (1) EP4139709A1 (en)
JP (1) JP7348414B2 (en)
KR (1) KR20220146617A (en)
CN (1) CN115485582A (en)
DE (1) DE102020110809B3 (en)
WO (1) WO2021213788A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102020128732A1 (en) 2020-11-02 2022-05-05 Daimler Ag Method and device for detecting blooming candidates in a lidar measurement
CN114371483B (en) * 2022-03-21 2022-06-10 深圳市欢创科技有限公司 Laser radar ranging method and device, laser radar and robot

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200064452A1 (en) * 2018-08-24 2020-02-27 Velodyne Lidar, Inc. Systems and methods for mitigating optical crosstalk in a light ranging and detection system
US20210183016A1 (en) * 2017-09-01 2021-06-17 Sony Corporation Image processing apparatus, image processing method, program, and moving body

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005003970A1 (en) 2005-01-27 2006-08-03 Daimlerchrysler Ag Sensor arrangement's operability determining method for motor vehicle, involves detecting sensor signals successively for different sub-regions while passing at surrounding region, and determining number of measuring events in latter region
EP3550329A1 (en) 2018-04-04 2019-10-09 Xenomatix NV System and method for determining a distance to an object
DE102018003593A1 (en) 2018-05-04 2018-10-25 Daimler Ag Method for operating an assistance system of a vehicle
US11041957B2 (en) 2018-06-25 2021-06-22 Toyota Motor Engineering & Manufacturing North America, Inc. Systems and methods for mitigating effects of high-reflectivity objects in LiDAR data
JP2021148746A (en) 2020-03-23 2021-09-27 株式会社リコー Distance measuring device and distance measuring method
WO2022185780A1 (en) * 2021-03-03 2022-09-09 ソニーグループ株式会社 Information processing device, information processing method, and program

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210183016A1 (en) * 2017-09-01 2021-06-17 Sony Corporation Image processing apparatus, image processing method, program, and moving body
US20200064452A1 (en) * 2018-08-24 2020-02-27 Velodyne Lidar, Inc. Systems and methods for mitigating optical crosstalk in a light ranging and detection system

Also Published As

Publication number Publication date
DE102020110809B3 (en) 2021-10-21
US11619725B1 (en) 2023-04-04
JP7348414B2 (en) 2023-09-20
EP4139709A1 (en) 2023-03-01
KR20220146617A (en) 2022-11-01
CN115485582A (en) 2022-12-16
JP2023515267A (en) 2023-04-12
WO2021213788A1 (en) 2021-10-28

Similar Documents

Publication Publication Date Title
US11719788B2 (en) Signal processing apparatus, signal processing method, and program
US11513212B2 (en) Motor vehicle and method for a 360° detection of the motor vehicle surroundings
US10048381B2 (en) Opto-electronic detection device and method for sensing the surroundings of a motor vehicle by scanning
EP2910971B1 (en) Object recognition apparatus and object recognition method
KR102020037B1 (en) Hybrid LiDAR scanner
US20210125487A1 (en) Methods and systems for detecting intrusions in a monitored volume
US11619725B1 (en) Method and device for the recognition of blooming in a lidar measurement
KR20190074769A (en) Apparatus for Light Detection and Ranging
CN112130158A (en) Object distance measuring device and method
CN110954912B (en) Method and apparatus for optical distance measurement
JP2019128350A (en) Image processing method, image processing device, on-vehicle device, moving body and system
US11614528B2 (en) Setting method of monitoring system and monitoring system
US20220365219A1 (en) Pixel Mapping Solid-State LIDAR Transmitter System and Method
KR101868293B1 (en) Apparatus for Providing Vehicle LIDAR
US20220171030A1 (en) Lidar measuring system with two lidar measuring devices
US20220091236A1 (en) Techniques for detecting and mitigating interference among multiple lidar sensors
US20220137218A1 (en) Detecting Retroreflectors in NIR Images to Control LIDAR Scan
US20230408702A1 (en) Method and device for identifying blooming candidates in a lidar measurement
US11815626B2 (en) Method for detecting intensity peaks of a specularly reflected light beam
US11543493B2 (en) Distance measuring unit
US20220179077A1 (en) Method for supplementary detection of objects by a lidar system
US9972098B1 (en) Remote distance estimation system and method
KR20230113343A (en) Active sensor systems and object detection
WO2023152422A1 (en) Light-emitting device
CN113050073A (en) Reference plane calibration method, obstacle detection method and distance detection device

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: MERCEDES-BENZ GROUP AG, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MEINKE, MARTIN;PETER, DAVID;BUCK, SEBASTIAN;SIGNING DATES FROM 20221013 TO 20221020;REEL/FRAME:062012/0162

STCF Information on status: patent grant

Free format text: PATENTED CASE