WO2023247304A1 - Method for operating a LiDAR system, LiDAR system, and vehicle comprising at least one LiDAR system

Info

Publication number
WO2023247304A1
Authority
WO
WIPO (PCT)
Prior art keywords: reception, oversaturated, distance, area, areas
Application number: PCT/EP2023/066060
Other languages: German (de), English (en)
Inventors: Hansjoerg Schmidt, Johannes Michael, Christoph Parl, Thorsten BEUTH
Original Assignee: Valeo Detection Systems GmbH
Application filed by Valeo Detection Systems GmbH
Publication of WO2023247304A1


Classifications

    • G01S 7/493: Extracting wanted echo signals (details of non-pulse lidar systems)
    • G01S 7/4808: Evaluating distance, position or velocity data
    • G01S 7/4913: Circuits for detection, sampling, integration or read-out (lidar receivers, non-pulse systems)
    • G01S 7/4915: Time delay measurement, e.g. operational details for pixel components; phase measurement
    • G01S 17/894: 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G01S 17/931: Lidar systems specially adapted for anti-collision purposes of land vehicles

Definitions

  • the invention relates to a method for operating a LiDAR system, in particular a LiDAR system of a vehicle, in which at least one electromagnetic scanning beam is sent into at least one monitoring area using at least one transmitting device of the LiDAR system; at least one electromagnetic reception beam coming from the at least one monitoring area, which originates from the at least one electromagnetic scanning beam reflected at at least one object target, is received with at least one of several reception areas of a reception matrix of a receiving device of the LiDAR system; at least a part of the at least one received electromagnetic reception beam is converted into received variables, the received variables being assigned to the respective reception areas hit by the at least one electromagnetic reception beam; it is determined whether at least one reception area has been oversaturated by the at least one incident reception beam; for at least a part of the non-oversaturated reception areas hit by the at least one electromagnetic reception beam, at least one distance variable is determined from at least part of the respective received variables, which characterizes a distance of the at least one object target to the corresponding reception area hit by the at least one reception beam; and for at least a part of the oversaturated reception areas, replacement distance variables are determined.
  • the invention further relates to a LiDAR system, in particular a LiDAR system of a vehicle, with at least one transmitting device with which electromagnetic scanning beams can be sent into at least one monitoring area, with at least one receiving device, which has at least one reception matrix with a plurality of reception areas, with which electromagnetic reception beams coming from the at least one monitoring area, which originate from electromagnetic scanning beams reflected at at least one object target, can be received, and which has means for converting electromagnetic reception beams into received variables which can be assigned to the reception areas hit by the corresponding electromagnetic reception beams, the LiDAR system having means for determining reception areas that are oversaturated by the incident reception beams, means for determining distance variables for non-oversaturated reception areas, which characterize distances of at least one object target to the reception areas, and means for determining replacement distance variables for oversaturated reception areas.
  • the invention also relates to a vehicle having at least one LiDAR system, with at least one transmitting device with which electromagnetic scanning beams can be sent into at least one monitoring area, with at least one receiving device, which has at least one reception matrix with several reception areas, with which electromagnetic reception beams coming from the at least one monitoring area, which originate from electromagnetic scanning beams reflected at at least one object target, can be received, and which has means for converting electromagnetic reception beams into received variables which can be assigned to the reception areas hit by the corresponding electromagnetic reception beams, the LiDAR system having means for determining reception areas that are oversaturated by the incident reception beams, means for determining distance variables for non-oversaturated reception areas, which characterize distances of the at least one object target to the reception areas, and means for determining replacement distance variables for oversaturated reception areas.
  • a distance measuring device is known from US 2020/0011972 A1.
  • the distance measuring device includes a TOF camera that measures the distance to the object using the TOF method and outputs distance data, a saturation detection unit that detects that a light receiving level (accumulated charge) of an image sensor is saturated in a light receiving unit in the TOF camera, an interpolation processing unit that stores distance data in a non-saturation area in a memory, reads the distance data in the non-saturation area, and performs an interpolation process of distance data of a saturation area, and an image processing unit that performs a coloring process for changing a color of an object position based on the distance data after the interpolation process and outputs a distance image.
  • the TOF camera includes the light-emitting unit and the light-receiving unit, and emits radiation light for measuring the distance from the light-emitting unit to the object.
  • the light receiving unit receives the light reflected from the object through an objective lens and outputs an amount of charge accumulated in each pixel position as a signal via an image sensor in which the pixels are arranged in a two-dimensional shape, e.g. a CCD.
  • the invention is based on the object of designing a method, a LiDAR system and a vehicle of the type mentioned at the outset in which the determination of distance variables can be further improved, in particular in which errors and/or gaps in the determination of distance variables due to oversaturated reception areas can be corrected.
  • the object is achieved in the method in that, if several reception areas are recognized as oversaturated and at least some of the oversaturated reception areas form an at least partially coherent saturation field, at least three non-oversaturated reception areas, which are located in a ring of non-oversaturated reception areas surrounding the at least one saturation field, are identified as ring reception areas; respective ring coordinates of a coordinate system are determined for at least three of the ring reception areas, the ring coordinates having at least one reception matrix coordinate, which represents the position of the respective ring reception area within the reception matrix, and at least one distance variable coordinate, which represents the respective determined distance variable; from the ring coordinates of at least three of the ring reception areas, an approximation surface is realized in the coordinate system; for at least one oversaturated reception area, whose position within the reception matrix is represented with at least one corresponding reception matrix coordinate, a proximity point is determined on the approximation surface which has the same at least one reception matrix coordinate as the at least one oversaturated reception area and at least one proximity distance coordinate; and at least one proximity distance variable for the at least one oversaturated reception area is determined using the at least one proximity distance coordinate of the proximity point.
  • in other words, an approximation surface is generated using coordinates of non-oversaturated reception areas which surround a saturation field of oversaturated reception areas.
  • the approximation surface approximates the coordinates of the oversaturated reception areas.
  • the proximity points on the approximation surface are characterized by at least one reception matrix coordinate and at least one proximity distance coordinate.
  • the reception matrix coordinate of a proximity point enables an assignment to a reception area with the same reception matrix coordinates.
  • the at least one reception matrix coordinate can have at least one row coordinate and/or a column coordinate of the reception matrix, in particular an x coordinate and/or a y coordinate of a Cartesian coordinate system.
  • the proximity distance coordinate gives the further coordinate of the proximity point in the two-dimensional or three-dimensional coordinate system, which can be assigned to a distance dimension of the coordinate system.
  • the distance dimension indicates the distance variables for field of view areas in reception area fields of view of the reception areas. The coordinate system is two-dimensional if the reception matrix is implemented as a line; if the reception matrix is realized as a surface, the coordinate system is three-dimensional.
  • the ring coordinates can each indicate a field of view area in a reception area field of view of the respective ring reception area at a distance which is characterized by the distance size.
  • a reception area field of view is the field of view of a reception area. Object targets that are located in a field of view area of the reception area field of view can be detected with the corresponding reception area.
  • a reception area field of view can be characterized by a direction, in particular at least one directional angle, in which the reception area field of view extends relative to a reference point of the LiDAR system.
  • a field of view area is a region of a reception area field of view at a certain distance relative to a reference point, in particular a reference point of the LiDAR system. The distance is characterized with a distance size.
  • a reference point of the LiDAR system can lie in the plane of the reception matrix. In this way, distances, directions and/or speeds of detected object targets can be characterized relative to the reception matrix.
  • the reference point can also be implemented elsewhere in the LiDAR system.
  • the reference point can also be implemented outside the LiDAR system.
  • for at least one oversaturated reception area, a corresponding proximity point is determined on the approximation surface which has the same at least one reception matrix coordinate as the at least one oversaturated reception area.
  • at least one proximity distance coordinate of the proximity point is determined for the at least one oversaturated reception area.
  • in this way, the direction of a vector between the at least one oversaturated reception area and the corresponding proximity point on the approximation surface is defined.
  • the proximity point with the same at least one reception matrix coordinate lies in the reception area field of view of the at least one oversaturated reception area.
  • the proximity point is therefore used to approximate the object target from which the reception beam leading to the oversaturation of the reception area comes.
  • the length of the vector can be determined using the proximity distance coordinate.
  • the length of the vector can be used to determine the proximity distance variable, which is used for the highly reflective object target instead of a distance variable that cannot be determined because of the oversaturation of the reception area; a sketch of this evaluation follows below.
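
By way of illustration, this replacement step can be sketched as follows, assuming the approximation surface is a plane D = a*x + b*y + c over the reception matrix coordinates (the plane-fit variant mentioned further below); the function name and signature are illustrative, not taken from the patent:

```python
def proximity_distance(plane, x, y):
    """Evaluate the approximation surface D = a*x + b*y + c at the
    reception matrix coordinates (x, y) of an oversaturated reception
    area. The returned value serves as the proximity distance variable,
    replacing the distance variable that cannot be determined because
    of the oversaturation."""
    a, b, c = plane
    return a * x + b * y + c
```
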
  • oversaturation of reception areas can occur due to strong electromagnetic reception beams which come from scanning beams reflected at highly reflective object targets, in particular retroreflective object targets. No valid distance value can be determined from the received variables of the oversaturated reception areas. Oversaturation of a reception area can be identified, in particular, already when the reception beams are received.
  • the oversaturation of reception areas also leads to a distortion of the received variables in non-oversaturated reception areas in the vicinity of the oversaturated reception areas. It has been shown that the distance between the highly reflective object target and the oversaturated reception areas hit by the at least one reception beam can be characterized using the distance variables of the non-oversaturated reception areas in a ring around the oversaturated reception areas.
  • the reception areas in the vicinity of oversaturated reception areas experience a radiation effect caused by the oversaturating reception beams.
  • the radiation effect in the ring around oversaturated reception areas is used according to the invention to determine the distance variable for the distance of the object target that causes the oversaturation. In this way, even with integration times that are long enough to detect weakly reflective object targets in the at least one monitoring area with the LiDAR system, distance variables can also be determined for highly reflective object targets, which lead to oversaturation.
  • the approximation surface is determined using the ring coordinates of non-oversaturated reception areas in a ring around a saturation field with several oversaturated reception areas.
  • orientations of objects with object targets that lead to oversaturation can also be determined relative to the LiDAR system.
  • the invention can be used to determine the distance for each object target on the surface more precisely.
  • a distance variable coordinate in the sense of the invention is a coordinate which is determined from a received reception beam.
  • the distance variable coordinate can comprise at least one received variable.
  • a distance variable coordinate can be used to characterize a distance of an object target, from which the at least one reception beam comes, relative to the reception area that is hit.
  • An approximation surface is also known as a "fit plane" or "best-fit plane".
  • the approximation surface can be determined using known approximation methods, in particular a plane-fit method, as sketched below.
  • Each proximity point of the approximation surface can be characterized with a data set, in particular a coordinate set, which has at least one reception matrix coordinate, which enables the proximity point to be assigned to a reception area, and a proximity distance coordinate.
  • the proximity distance coordinate characterizes a virtual distance of the proximity point to the reception area.
  • the virtual distance of the proximity point corresponds to the distance of a virtual object target to the reception area; reception beams coming from such a virtual object target and received with the reception area would cause a distance variable coordinate whose value corresponds to the value of the proximity distance coordinate of the proximity point.
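
The patent does not fix a particular approximation method; a least-squares plane fit through the ring coordinates, as one common plane-fit method, could look like this (illustrative NumPy sketch, names are assumptions):

```python
import numpy as np

def fit_plane(ring_x, ring_y, ring_d):
    """Least-squares fit of a plane D = a*x + b*y + c through the ring
    coordinates (reception matrix coordinates x, y and distance
    variable coordinate D) of at least three ring reception areas.
    Returns the plane coefficients (a, b, c)."""
    A = np.column_stack([ring_x, ring_y, np.ones(len(ring_x))])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(ring_d, dtype=float), rcond=None)
    return coeffs
```

With exactly three ring reception areas the plane is fully determined; with more, the least-squares fit averages out noise in the ring coordinates.
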
  • a direction of the at least one reflecting object target relative to the reception matrix can be determined via the position of the reception areas within the reception matrix hit by the at least one electromagnetic reception beam.
  • with the reception matrix, a spatially resolved LiDAR measurement is possible.
  • the reception matrix can be a one-dimensional reception matrix.
  • a spatial resolution can be achieved in two dimensions, in particular in the horizontal or vertical direction and in the distance direction.
  • the reception matrix can advantageously be a two-dimensional reception matrix.
  • the position of the at least one object target can be determined in three dimensions, in particular in the horizontal direction, in the vertical direction and in the distance direction.
  • the at least one electromagnetic scanning beam may comprise or consist of an electromagnetic scanning signal.
  • additional information can be transmitted with the at least one scanning beam.
  • the at least one scanning signal can be coded. In this way, the assignment of the at least one scanning beam on the receiver side can be simplified.
  • a distance variable in the sense of the invention is a variable that characterizes a distance of a detected object target to a reference point, in particular a reference point of the LiDAR system.
  • a distance variable can be a shift, in particular a phase shift, between a transmitted scanning beam in the form of an amplitude-modulated scanning beam and the corresponding electromagnetic reception beam; the relation below makes this concrete.
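
For reference, the standard relation between the phase shift and the geometric distance for an amplitude-modulated scanning beam is assumed here, since the text does not spell it out; with modulation frequency $f_{\mathrm{mod}}$ and speed of light $c$:

```latex
D \;=\; \frac{c}{4\pi f_{\mathrm{mod}}}\,\Delta\varphi ,
\qquad 0 \le \Delta\varphi < 2\pi ,
```

so that the unambiguous measuring range is $c/(2 f_{\mathrm{mod}})$.
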
  • a distance quantity can be the amount of a geometric length.
  • An object target in the sense of the invention is a location on an object at which electromagnetic scanning beams, which are sent by the LiDAR system into the at least one monitoring area, can be reflected. Each object can have multiple object targets.
  • electromagnetic scanning beams in the form of laser beams, in particular laser signals, can be sent with the at least one transmitting device.
  • Laser beams can be precisely defined and emitted over a long range, especially several hundred meters.
  • the LiDAR system can be a flash LiDAR system.
  • each transmitted electromagnetic scanning beam illuminates, similar to a flash, large parts of the at least one monitoring area, in particular the entire monitoring area.
  • the LiDAR system can advantageously be designed as a laser-based distance measuring system.
  • a laser-based distance measuring system can have at least one laser, in particular a diode laser, as the light source of a transmitting device.
  • pulsed scanning beams can be sent with the at least one laser.
  • the laser can be used to emit scanning beams in wavelength ranges that are visible or invisible to the human eye.
  • the receiving matrix of the receiving device can be implemented with at least one sensor designed for the wavelength of the emitted scanning beam, in particular a CCD sensor, an active pixel sensor, in particular a CMOS sensor or the like.
  • Such sensors have a plurality of reception areas, in particular pixels or groups of pixels. Such sensors can be operated in such a way that the received variables determined can be assigned to the respective reception areas.
  • the method and the LiDAR system can advantageously be used in vehicles, in particular motor vehicles.
  • the method and the LiDAR system can advantageously be used in land vehicles, in particular passenger cars, trucks, buses, motorcycles or the like, in aircraft, in particular drones, and/or in watercraft.
  • the method and the LiDAR system can also be used in vehicles that can be operated autonomously or at least partially autonomously.
  • the method and the LiDAR system are not limited to vehicles. They can also be used in stationary operation, in robotics and/or in machines, in particular construction or transport machines, such as cranes, excavators or the like.
  • the LiDAR system can have a processor device, in particular an electronic control and evaluation device, with which the LiDAR system can be controlled and received variables can be determined and/or processed.
  • the LiDAR system can advantageously be connected to or be part of at least one electronic control device of a vehicle or a machine, in particular a driver assistance system or the like. In this way, at least some of the functions of the vehicle or machine can be operated autonomously or semi-autonomously.
  • the LiDAR system can be used to detect stationary or moving objects, in particular vehicles, people, animals, plants, obstacles, uneven road surfaces, in particular potholes or stones, road boundaries, traffic signs, open spaces, in particular parking spaces, precipitation or the like, and/or movements and/or gestures.
  • the ring coordinates for the ring reception areas can be specified in a Cartesian coordinate system. In this way, the coordinates of the approximation surface can be easily determined using trigonometry.
  • the reception matrix coordinates and the distance variable coordinates of the ring reception areas can be specified in a Cartesian coordinate system.
  • object targets that are detected with the ring reception areas can be specified in the coordinates of the coordinate system.
  • a position of a detected object target relative to the LiDAR system can first be determined in polar coordinates or spherical coordinates.
  • the origin of the corresponding polar or spherical coordinate system can lie in a reference point of the LiDAR system.
  • the polar coordinates or spherical coordinates can contain at least one direction quantity and at least one distance quantity.
  • the direction of the detected object target can be characterized in relation to the reference point.
  • An azimuth and/or an elevation angle can be determined as the directional variable.
  • the at least one distance variable can be used to characterize a distance of the detected object target from the reference point.
  • the respective ring coordinates can be determined as Cartesian coordinates of a Cartesian coordinate system.
  • the respective Cartesian coordinates can be determined from the polar coordinates or spherical coordinates of the respective field of view areas of the ring reception areas.
  • the approximation surface can first be determined in the Cartesian coordinate system.
  • the Cartesian coordinates of the determined proximity points can then be converted into polar coordinates or spherical coordinates of the polar or spherical coordinate system.
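
A conversion sketch under the convention that the polar angle is measured from the vertical axis and the azimuth in the horizontal plane; the patent leaves the exact convention open (in the embodiment described later, the distance variable D is simply used as the z coordinate), so this is an illustrative assumption:

```python
import numpy as np

def spherical_to_cartesian(azimuth, polar, distance):
    """Convert spherical coordinates (azimuth angle, polar angle,
    distance variable D) of a field of view area into Cartesian
    coordinates, polar angle measured from the vertical (z) axis."""
    x = distance * np.sin(polar) * np.cos(azimuth)
    y = distance * np.sin(polar) * np.sin(azimuth)
    z = distance * np.cos(polar)
    return x, y, z

def cartesian_to_spherical(x, y, z):
    """Inverse conversion, e.g. for the determined proximity points."""
    d = np.sqrt(x * x + y * y + z * z)
    polar = np.arccos(z / d)
    azimuth = np.arctan2(y, x)
    return azimuth, polar, d
```
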
  • an approximation plane can be determined as the approximation surface. In this way, distances from object targets on flat surfaces of objects, in particular street signs or the like, can be determined more precisely.
  • At least one proximity distance coordinate of the proximity point can be used as the proximity distance variable for the at least one oversaturated reception area. In this way, a distance variable for an object target detected with the oversaturated reception area can be determined from the approximation surface with little effort.
  • a reception area data set can be created for at least one reception area. The reception area data set can contain at least one coordinate value of a Cartesian coordinate system, in particular at least one reception matrix coordinate, which represents the position of the reception area within the reception matrix; and/or at least one direction value, in particular at least one angular coordinate, in particular an azimuth or elevation angle, which characterizes the direction of the reception area field of view that can be detected with the corresponding reception area relative to a reference point, in particular of the LiDAR system; and/or at least one distance value, in particular at least one distance variable coordinate, which represents at least one distance variable; and/or at least one received variable.
  • the variables determined from the measurements can be assigned to the respective reception area and further processed accordingly.
  • a reception area data set can first be created for at least one reception area, which can contain at least one direction value and at least one distance value.
  • the positions of detected object targets can be specified in polar or spherical coordinates.
  • a reception area data set can be created for at least one ring reception area, which contains at least one coordinate value of a Cartesian coordinate system.
  • At least one of the coordinate values can be at least one reception matrix coordinate, which represents the position of the ring reception area within the reception matrix.
  • a further coordinate value can be determined from the distance variable for the at least one ring reception area.
  • the coordinate values of the Cartesian coordinate system can be determined from direction values and distance values of a polar or spherical coordinate system which were previously stored in a corresponding reception area data set for the ring reception area.
  • the corresponding reception area data set can be provided with at least one oversaturation mark; in particular, at least one received variable of the reception area data set can be set to a constant high value. In this way, this reception area data set can be flagged for further processing, so that it can be recognized later that the reception area data set has been provided with a replacement received variable and/or a proximity distance variable.
  • all received variables of the reception area data set can be set to a constant high value. In this way, the corresponding reception area is unambiguously marked.
  • At least one amplitude-modulated scanning beam can be sent with the at least one transmitting device, and the at least one electromagnetic reception beam coming from the at least one reflected scanning beam and striking at least one reception area can be converted into respective received variables during at least two recording time ranges, wherein at least two of the recording time ranges are started out of phase with respect to a modulation period of the at least one transmitted scanning beam.
  • the LiDAR system can thus be operated using an indirect time-of-flight (iToF) method.
  • the distance variables can be determined from phase shifts between the transmitted amplitude-modulated scanning beam and the received reflected scanning beam.
  • the corresponding received variables can be determined separately for each reception area that is hit by a reception beam, at least for the two recording time ranges. In this way, a corresponding distance variable can be determined separately from the received variables for each reception area.
  • the respective received variables can be determined in the form of an amplitude of a phase image of the reception beam, which is recorded during a recording time range. Amplitudes can be determined easily.
  • the received variables and/or the distance variables can be determined using a time-of-flight method, in particular an indirect time-of-flight method. In this way, the accuracy of the distance measurement can be further improved.
  • a reception area can be recognized as oversaturated if it is oversaturated during at least one of the at least two recording time ranges, and/or a reception area can be recognized as not oversaturated if it is not oversaturated during any of the at least two recording time ranges (see the sketch below). In this way, oversaturation can be detected more reliably and the overall accuracy of the distance measurement can be further improved.
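
This detection rule can be stated compactly; a minimal sketch, where `full_scale` stands for the sensor-specific saturation level, which the text does not specify:

```python
def is_oversaturated(samples, full_scale):
    """A reception area counts as oversaturated if it saturated during
    at least one of its recording time ranges; it counts as not
    oversaturated only if it saturated during none of them."""
    return any(s >= full_scale for s in samples)
```
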
  • At least part of the at least one monitoring area can be illuminated with at least one electromagnetic flash scanning beam. In this way, at least a part of the at least one monitoring area can be illuminated simultaneously with the same flash scanning beam. In this way, a snapshot of at least one simultaneously illuminated part of the monitoring area can be realized.
  • a flash scanning beam spreads out in different directions in space, similar to a flash light.
  • the at least one monitoring area can be scanned with at least one electromagnetic scanning beam in one spatial dimension, in particular in the horizontal direction.
  • in this way, a two-dimensional distance image can be realized, in particular with one dimension in the horizontal direction and one dimension in the distance direction.
  • the at least one monitoring area can advantageously be scanned with at least one electromagnetic scanning beam in two spatial dimensions, in particular in the horizontal and vertical directions.
  • in this way, a three-dimensional distance image can be realized, in particular with one dimension in the horizontal direction, one dimension in the vertical direction and one dimension in the distance direction.
  • the object is achieved according to the invention in the LiDAR system in that the LiDAR system has means for carrying out the method according to the invention.
  • the LiDAR system can have means with which it can be recognized whether several reception areas are oversaturated and whether at least some of the oversaturated reception areas form an at least partially coherent saturation field.
  • the LiDAR system can have means with which at least three non-oversaturated reception areas, which are located in a ring of non-oversaturated reception areas surrounding the at least one saturation field, can be identified as ring reception areas.
  • the LiDAR system can have means with which respective ring coordinates of a coordinate system can be determined for at least three of the ring reception areas.
  • the ring coordinates can have at least one reception matrix coordinate, which represents the position of the respective ring reception area within the reception matrix, and at least one distance variable coordinate, which represents the respective at least one received variable or the distance variable determined therefrom.
  • the LiDAR system can have means with which an approximation surface can be realized in the coordinate system from the ring coordinates of at least three of the ring reception areas.
  • the LiDAR system can have means with which, for at least one oversaturated reception area whose position within the reception matrix is represented with at least one corresponding reception matrix coordinate, a proximity point can be determined on the approximation surface which has the same at least one reception matrix coordinate as the at least one oversaturated reception area and at least one proximity distance coordinate.
  • the LiDAR system can have means with which at least one proximity distance variable for the at least one oversaturated reception area can be determined using at least one proximity distance coordinate of the proximity point.
  • the means for carrying out the method according to the invention can be implemented using software and/or hardware. In this way, the means can be implemented efficiently.
  • at least some of the means for carrying out the method according to the invention can be implemented with at least one processor of the LiDAR system, in particular of a control and evaluation device.
  • existing components can be used.
  • the reception beams can be received with reception areas of a one-dimensional reception matrix.
  • the receiving device of the LiDAR system can have at least one one-dimensional receiving matrix, in particular a line matrix.
  • the at least one monitoring area can be recorded in a spatially resolved manner in two dimensions, namely one dimension of the reception matrix and a distance dimension.
  • the reception beams can be received with reception areas of a two-dimensional reception matrix.
  • the receiving device of the LiDAR system can have at least one two-dimensional reception matrix, in particular an area matrix.
  • the at least one monitoring area can thus be recorded in a spatially resolved manner in three dimensions, namely the two dimensions of the reception matrix and a distance dimension.
  • directions of detected object targets can be determined relative to the LiDAR system.
  • the direction of the corresponding object targets can be determined from the positions of the reception areas within the reception matrix hit by reception beams.
  • At least one receiving matrix can be implemented with at least one CCD chip and/or at least one receiving matrix can be implemented with a 3D ToF imager.
  • the electromagnetic reception beams can be received with the corresponding reception areas, which in CCD chips and 3D ToF imagers can be referred to as pixels, and converted into electrical reception signals.
  • the corresponding received variables can be determined from the electrical received signals.
  • 3D ToF imagers are three-dimensional time-of-flight imagers. With 3D ToF imagers, three-dimensional distance images of monitoring areas can be recorded.
  • the object is achieved according to the invention in the vehicle in that the vehicle has means for carrying out the method according to the invention.
  • the means for carrying out the method according to the invention can be implemented with the at least one LiDAR system, in particular a LiDAR system according to the invention.
  • the at least one LiDAR system can be configured before it is installed on the vehicle.
  • the at least one LiDAR system is part of the vehicle, so that the means of the at least one LiDAR system are also means of the at least one vehicle.
  • with the at least one LiDAR system, at least one monitoring area in the surroundings of the vehicle and/or in the interior of the vehicle can be monitored for object targets.
  • distances to detected object targets can be determined.
  • the vehicle can have at least one driving assistance system.
  • the vehicle can be operated autonomously or at least partially autonomously.
  • At least one LiDAR system can be functionally connected to at least one driver assistance system of the vehicle.
  • information about the at least one monitoring area, in particular distance variables and/or directional variables of detected object targets, which can be determined with the at least one LiDAR system, can be transmitted to the at least one driver assistance system.
  • with the at least one driver assistance system, the vehicle can be operated autonomously or at least partially autonomously, taking into account the information about the at least one monitoring area, in particular about objects.
  • Figure 1 shows a front view of a vehicle with a driver assistance system and a LiDAR system for detecting objects
  • Figure 2 shows a functional representation of a part of the vehicle with the driver assistance system and the LiDAR system from Figure 1;
  • Figure 3 shows a front view of a reception matrix of a reception device of the LiDAR system from Figures 1 and 2, the reception matrix having a plurality of reception areas;
  • Figure 4 shows an intensity-time diagram with four exemplary phase images DCS0 to DCS3, which are determined with respective phase shifts of 90° from a received light signal of a reflected transmitted light signal of the LiDAR system from Figures 1 and 2, and whose amplitudes serve as received variables for determining distances of objects;
  • Figure 5 shows an intensity image of a scene with a retroreflective object in the form of a street sign in a grayscale representation;
  • Figure 6 shows a received variable image for one of the phase images DCSi for the scene from Figure 5 in a grayscale representation;
  • Figure 7 shows a section along a line of the received variable image from Figure 6;
  • Figure 8 shows a detailed view of the intensity image from Figure 5 in the area of the object;
  • Figure 9 shows a detailed view of a distance image for the scene from Figure 8 in the area of the object, with reception areas which receive received light signals from the object being oversaturated;
  • Figure 10 shows a detailed view of a distance image for the scene from Figure 5 in the area of the object after correction of the oversaturated reception areas;
  • Figure 11 shows a section along a line of the distance image from Figure 9;
  • Figure 12 shows a section along a line of the distance image from Figure 10.
  • the vehicle 10 has a LiDAR system 12, which is designed as a flash LiDAR system.
  • the LiDAR system 12 is, for example, arranged in the front bumper of the vehicle 10.
  • a monitoring area 14 in the direction of travel 16 in front of the vehicle 10 can be monitored for objects 18.
  • the LiDAR system 12 can also be arranged elsewhere on the vehicle 10 and aligned differently.
  • object information, for example distance variables D, directional variables and speed variables, can be determined, which characterize distances, directions and speeds of objects 18 relative to the vehicle 10 or relative to a reference point 48 of the LiDAR system 12.
  • the objects 18 can be stationary or moving objects, for example other vehicles, people, animals, plants, obstacles, bumps in the road, for example potholes or stones, road boundaries, traffic signs such as street signs, open spaces, for example parking spaces, precipitation or the like.
  • Each object 18 usually has several object targets 19.
  • An object target 19 is a location of an object 18 at which electromagnetic scanning beams in the form of transmitted light signals 20, which are sent from the LiDAR system 12 into the monitoring area 14, can be reflected.
  • An object 18 in the form of a street sign is indicated as an example in the figures.
  • the object 18 has a flat, retroreflective surface.
  • only one of the object targets 19 of the object 18 is indicated and labeled with a cross.
  • the LiDAR system 12 is connected to a driver assistance system 22. With the driver assistance system 22, the vehicle 10 can be operated autonomously or semi-autonomously.
  • the LiDAR system 12 includes, for example, a transmitting device 24, a receiving device 26 and a control and evaluation device 28.
  • the control and evaluation device 28 is, for example, an electronic control and evaluation device, for example with one or more processors.
  • the functions of the control and evaluation device 28 can be implemented centrally or decentrally using software and/or hardware. Parts of the functions of the control and evaluation device 28 can also be integrated in the transmitting device 24 and/or the receiving device 26.
  • transmission variables can be generated in the form of electrical transmission signals.
  • the transmitting device 24 can be controlled with the electrical transmission signals, so that it sends amplitude-modulated transmitted light signals 20 in the form of laser pulses into the monitoring area 14.
  • the transmitting device 24 has, for example, a laser as a light source.
  • the laser can be used to generate transmitted light signals 20 in the form of laser pulses.
  • the transmitting device 24 has an optical device with which the transmitted light signals 20 are expanded so that they can spread into the entire monitoring area 14 - similar to a flash light. In this way, the entire monitoring area 14 can be illuminated with each transmitted light signal 20.
  • the transmitted light signals 20 can therefore also be referred to as “flash scanning beams”.
  • Transmitted light signals 20 reflected on the object 18 in the direction of the receiving device 26 can be received with the receiving device 26.
  • the reflected beams originating from the scanning beams, namely the transmitted light signals 20, are referred to below as received beams or received light signals 30 for the sake of better distinction.
  • the receiving device 26 can optionally have a received light signal deflection device with which the received light signals 30 are directed to a receiving matrix 32 of the receiving device 26 shown in FIG. 3.
  • the receiving matrix 32 is implemented, for example, with an area sensor in the form of a CCD sensor. Instead of a CCD sensor, a different type of surface sensor, for example an active pixel sensor or the like, can also be used.
  • the reception matrix 32 has a large number of reception areas 34. Each reception area 34 can be implemented, for example, by a group of pixels.
  • the reception matrix 32 described here has, for example, 320 columns with 240 reception areas 34 each. For the sake of clarity, only 7 x 7 of the reception areas 34 are indicated in Figure 3. Furthermore, the x-axis and the y-axis of an imaginary Cartesian x-y coordinate system are shown in Figure 3. The x-y plane runs in the plane of the reception matrix 32.
  • the x-axis runs in the direction of the rows of the reception matrix 32.
  • the x coordinates indicate the columns of the reception matrix 32.
  • the y-axis runs in the direction of the columns of the reception matrix 32.
  • the y-coordinates indicate the rows of the reception matrix 32.
  • the reference point 48 of the LiDAR system 12 is, for example, at the coordinate origin of the xy coordinate system. The reference point 48 can also be located elsewhere.
  • Each reception area 34 has a reception area field of view 50.
  • a reception area field of view 50 is the field of view of a reception area 34.
  • Object targets 19, which are located in a field of view area 52 of the reception area field of view 50, can be detected with the corresponding reception area 34.
  • a reception area field of view 50 is characterized, for example, by a direction, for example two directional angles, in which the reception area field of view 50 extends relative to the reference point 48 of the LiDAR system 12.
  • a field of view area 52 is a region of a reception area field of view 50 at a certain distance, for example relative to the reference point 48 of the LiDAR system. The distance is characterized with a distance size D.
  • the directional angles can be, for example, the azimuth α and the polar angle β indicated in Figure 3.
  • the azimuth α can, for example, be specified in relation to an imaginary axis which runs parallel to a vehicle longitudinal axis of the vehicle 10.
  • the polar angle β can be specified, for example, in relation to an imaginary axis which runs parallel to a vehicle vertical axis of the vehicle 10.
  • the polar angle β for a detected object target 19 indicates the elevation of the object target 19.
  • the received variables A0, A1, A2 and A3 can also be referred to below collectively as received variables Ai.
  • the received variables A0, A1, A2 and A3 are the amplitudes of phase images (differential correlation samples) DCS0, DCS1, DCS2 and DCS3, which can also be referred to below as phase images DCSi for the sake of simplicity.
  • the received variables Ai and the phase images DCSi can be assigned to the respective reception areas 34.
  • Each reception area 34 can be activated via suitable closure means for the detection of received light signals 30 for defined recording time ranges TB0, TB1, TB2 and TB3.
  • the recording time ranges can also be referred to below as recording time ranges TBi.
  • the reception areas 34 can each be activated in four recording time ranges TB0, TB1, TB2 and TB3 for detecting received light signals 30.
  • Each recording time range TBi is defined by a start time and an integration period.
  • the time intervals between two defined recording time ranges TBi are smaller than the period duration TMOD of the modulation period MP.
  • portions of received light signals 30 hitting the respective reception area 34 can be converted into corresponding electrical reception signals.
  • the respective phase images DCSi and their amplitudes Ai can be determined from the received signals, which characterize respective signal sections of the received light signal 30 in the respective recording time ranges TBi.
  • the phase images DCSi and their amplitudes, i.e. the received variables Ai characterize the respective amount of light that is collected during the recording time ranges TBi with the correspondingly activated reception area 34 of the reception matrix 32.
  • each reception area 34 can be activated and read out individually.
  • the closure means can be implemented using software and/or hardware. Such closure means can be implemented as so-called “shutters”.
  • the reception areas 34 can be controlled with corresponding periodic recording control signals in the form of shutter signals.
  • the shutter signals can be triggered via the electrical transmission signals with which the laser of the transmission device 24 is controlled, or together with them.
  • the received variables Ai are thus related to the transmitted light signals 20.
  • the electrical transmission signals can be triggered at a starting time ST.
  • the reception areas 34 are triggered with the corresponding time-offset shutter signals.
  • the receiving device 26 can have optical elements with which received light signals 30 coming from the monitoring area 14 are imaged onto respective reception areas 34 depending on the direction from which they come. In this way, an individual reception area field of view 50 can be realized for each reception area 34.
  • the direction of an object target 19 on which the transmitted light signal 20 is reflected can thus be determined from the position of the illuminated reception areas 34 within the reception matrix 32.
  • Figure 4 shows a modulation period MP of a reception envelope 36 of the phase images DCS0, DCS1, DCS2 and DCS3 in a common intensity-time diagram.
  • the axis for the intensity variables is labeled “INT” and the time axis is labeled “t”.
  • the reception envelope 36 is offset in time from the starting time ST.
  • the time offset in the form of a phase difference characterizes the flight time between the emission of the transmitted light signal 20 and the reception of the corresponding received light signal 30.
  • the phase difference φ can be determined from the received variables A0, A1, A2 and A3 of the phase images DCS0, DCS1, DCS2 and DCS3.
  • the distance variable D for the reflecting object 18 can be determined from the phase difference φ.
  • the distance variable D can also be determined directly from the received variables A0, A1, A2 and A3.
  • the phase shift itself can also be used as a distance variable.
  • the flight time is known to be proportional to the distance of the object target 19 relative to the LiDAR system 12.
  • the reception envelope 36 can be approximated by, for example, four support points in the form of the four phase images DCS0, DCS1, DCS2 and DCS3. Alternatively, the reception envelope 36 can also be approximated by more or fewer support points in the form of phase images.
  • the recording time ranges TB0, TB1, TB2 and TB3 are each started based on a reference event, for example in the form of a trigger signal for the electrical transmission signal at the starting time ST.
  • the modulation period MP of the transmitted light signal 20 extends over 360°.
  • the recording time ranges TB0, TB1, TB2 and TB3 each start at a spacing of 90° from one another in relation to the modulation period MP.
  • the recording time ranges TB0, TB1, TB2 and TB3 thus start with phase shifts of 0°, 90°, 180° and 270°, respectively, relative to the starting time ST.
  • a distance variable D for a detected object 18 can be determined mathematically from the amplitudes, i.e. the received variables A0, A1, A2 and A3, of the phase images DCS0, DCS1, DCS2 and DCS3 for a respective reception area 34; one common evaluation is sketched below.
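
One common four-phase evaluation, given here as an assumption since the text leaves the arithmetic open, derives the phase difference from the four amplitudes and the distance from the phase difference (the sign convention may differ between sensors):

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def distance_from_phase_images(a0, a1, a2, a3, f_mod):
    """Four-phase indirect time-of-flight evaluation: a0..a3 are the
    amplitudes of the phase images DCS0..DCS3 recorded with
    0°/90°/180°/270° phase offset; f_mod is the modulation frequency
    of the transmitted light signal in Hz."""
    phi = np.arctan2(a3 - a1, a0 - a2)      # phase difference in radians
    phi = np.mod(phi, 2.0 * np.pi)          # map into [0, 2*pi)
    return C * phi / (4.0 * np.pi * f_mod)  # distance variable D in meters
```
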
  • FIG. 5 shows an intensity image of a scene in a grayscale representation, which was captured with the LiDAR system 12 in a photography mode.
  • Figure 8 shows an enlargement of the intensity image from Figure 5 in the area of the object 18.
  • the object targets 19 are located on the left side of the object 18 in FIG. 5 at a greater distance from the LiDAR system 12 than the object targets 19 on the right side of the object 18.
  • the 320 columns of the reception matrix 32 are indicated in the horizontal dimension in the direction of the x-axis. Each column characterizes the horizontal direction from which the received light signals 30 received with the reception areas 34 of the column come, i.e. in which the corresponding object target 19 is located.
  • the 240 lines of the receiving matrix 32 are indicated in the vertical dimension in the direction of the y-axis. Each line characterizes the vertical direction from which the received light signals 30 received with the reception areas 34 of the line come, i.e. in which the corresponding object target 19 is located.
  • the intensity variables INT for the captured scene are defined according to a logarithmic grayscale scale shown next to the intensity images.
  • Figure 6 shows a received variable image for one of the phase images DCSi for the scene from Figure 5.
  • the horizontal x-axis indicates the columns and the vertical y-axis indicates the rows of the reception matrix 32.
  • the received variables Ai for the captured scene are defined according to a linear grayscale scale shown next to the received variable image.
  • Figure 7 shows a section through line 220 of the received variable image from Figure 6.
  • the reception areas 34 in columns 180 to 220 of Figures 6 and 7, which are hit by the received light signals 30 coming from the object targets 19 of the object 18, are identified as oversaturated.
  • the oversaturated reception areas 34 are already identified on the area sensor, for example the CCD sensor.
  • FIG. 9 shows a distance image of the enlarged scene from FIG. 8 in a grayscale representation.
  • the distance image was captured using the LiDAR system 12 with an integration time of 1000 µs.
  • Figure 10 shows a corrected distance image of the same scene in grayscale.
  • the 320 columns of the receiving matrix 32 are indicated in the horizontal dimension in the direction of the x-axis.
  • the 240 lines of the reception matrix 32 are indicated in the direction of the y-axis.
  • the distance sizes D for the captured scene are defined according to a linear grayscale scale shown next to the distance image.
  • Figure 11 shows a section through line 220 of the distance image from Figure 9.
  • Figure 12 shows a section through line 220 of the corrected distance image from Figure 10.
  • Such radiation effects can lead to a ring of radiation around the oversaturated reception areas 34, which can also be referred to as a corona.
  • the reference numbers for the non-oversaturated reception areas 34 affected by the radiation corona are given the index "k" below for better differentiation. It has been found that the radiation effect leads to received variables being determined with the non-oversaturated reception areas 34k in the vicinity of oversaturated reception areas 34 which match the distances of the retroreflective object targets 19 that cause the oversaturation of the neighboring oversaturated reception areas 34.
  • the received light signals 30 coming from the retroreflective object targets 19 of the object 18 lead to the oversaturation of a large number of reception areas 34, which form a coherent saturation field 38.
  • the saturation field 38 is surrounded by a ring 40 consisting of non-oversaturated reception areas 34k, which are affected by the radiation effect of the neighboring oversaturated reception areas 34.
  • the distance variables D determined with the ring reception areas 34k characterize the distances of the retroreflective object targets 19 of the object 18 which are recorded with the oversaturated reception areas 34 adjacent to the ring reception areas 34k. Since, as already mentioned, the object 18 shown as an example is arranged obliquely to the LiDAR system 12, larger distance variables D are determined for the ring reception areas 34k on the left side than for those on the right side.
  • the LiDAR system 12 is operated as described below:
  • an amplitude-modulated flash transmitted light signal 20 is sent into the monitoring area 14 using the transmitting device 24.
  • the transmitted light signal 20 is reflected at any object targets 19 that are present. In the scene shown in Figures 5 and 8, part of the transmitted light signal 20 is reflected at the object targets 19 of the object 18 and radiated as received light signals 30 in the direction of the receiving device 26.
  • the receiving device 26 is used to carry out a measurement with an integration period during which the received light signals 30 are received from the monitoring area 14 with the reception areas 34 in the four recording time ranges TB0, TB1, TB2 and TB3.
  • the length of the integration period is chosen so that even weaker received light signals 30 from weakly or normally reflective object targets 19, none of which are present in the exemplary scene, are sufficient to generate received signals that can be distinguished from noise in the reception areas 34.
  • the integration time is 1000 µs.
  • the received light signals 30 reflected at the retroreflective object targets 19 of the retroreflective object 18 shown are so strong that they lead to oversaturation in the corresponding reception areas 34 of the reception matrix 32.
  • a reception area data set is determined for each reception area 34, for example in the form (a, ß, D, Ao, Ai, A2, A3) with the coordinates of the spherical coordinate system and the four reception variables Ao, Ai, A2, A3.
  • the reception area data sets are stored in a storage medium of the LiDAR system 12.
  • the coordinates of the reception areas 34 each have two directional coordinates, for example in the form of azimuth α and polar angle β, and a distance size coordinate in the form of the distance size D.
  • the directional coordinates α and β represent the reception area field of view 50 of the reception area 34.
  • the distance size coordinate is obtained from the received light signals 30 received with the reception area 34 and represents the respective distance size D.
  • the received portion of the received light signals 30 is converted, for each of the reception areas 34, into the four corresponding reception variables A0, A1, A2 and A3 assigned to the respective reception area 34 (a sketch of how a distance size D could be derived from these four variables follows below).
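The document itself does not give the formula by which a distance size D follows from the four reception variables. Purely as a hedged illustration, a conventional four-phase evaluation for an amplitude-modulated indirect time-of-flight signal is sketched below in Python; the modulation frequency f_mod is an assumed example value, not taken from the document.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_bins(a0: float, a1: float, a2: float, a3: float,
                       f_mod: float = 10.0e6) -> float:
    """Derive a distance size D from the four reception variables.

    Conventional four-phase indirect time-of-flight evaluation;
    f_mod is an assumed example modulation frequency (10 MHz),
    not a value specified in the document.
    """
    phase = math.atan2(a3 - a1, a0 - a2)   # echo phase offset
    if phase < 0.0:
        phase += 2.0 * math.pi             # map into [0, 2*pi)
    return C * phase / (4.0 * math.pi * f_mod)
```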
  • the oversaturated reception areas 34 were already identified before the received light signals 30 were converted into the reception variables A0, A1, A2 and A3.
  • if a reception area 34 is identified as oversaturated, a constant saturation marking value As is used for the reception variables A0, A1, A2 and A3. In this way, the corresponding reception area 34 is marked as an oversaturated reception area 34 for the further processing phases.
  • the saturation marking value As is, for example, 4000 (a minimal sketch of such a marking step follows below).
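As an illustration only, the marking step could be sketched as follows; the array layout (rows, columns, four reception variables) is an assumption, not specified in the document.

```python
import numpy as np

A_S = 4000  # saturation marking value As from the text

def mark_oversaturated(bins: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Overwrite all four reception variables A0..A3 of every
    oversaturated reception area with the constant marking value A_S.

    bins: float array of shape (rows, cols, 4) with A0..A3 per area
    mask: boolean array of shape (rows, cols), True where oversaturated
    """
    marked = bins.copy()
    marked[mask] = A_S  # broadcasts A_S over all four variables
    return marked
```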
  • reception areas 34 are identified as oversaturated and form a partially coherent saturation field 38.
  • the reception areas 34 which are hit by the received light signals 30 coming from the retroreflective object 18 are oversaturated. These oversaturated reception areas 34 form an additional coherent saturation field 38, which is shown in white in the grayscale representation of Figure 6. In the section of the reception size image shown in Figure 7, the saturation marking value As is entered for the oversaturated reception areas 34, since these were identified as oversaturated.
  • the ring reception areas 34k of the ring 40, which surrounds the saturation field 38, are then identified (one possible identification step is sketched below).
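The text leaves open how the ring 40 is found. A minimal sketch, assuming the saturation field is available as a boolean mask and that scipy may be used, is a one-pixel morphological dilation of the field followed by removal of the field itself:

```python
import numpy as np
from scipy.ndimage import binary_dilation

def ring_mask(saturation_mask: np.ndarray) -> np.ndarray:
    """Return the ring 40 of non-oversaturated reception areas that
    directly border a coherent saturation field 38.

    saturation_mask: boolean array (rows, cols), True where oversaturated.
    Dilating the field by one reception area and removing the field
    itself leaves exactly the surrounding ring.
    """
    dilated = binary_dilation(saturation_mask)
    return dilated & ~saturation_mask
```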
  • a corresponding distance size D is determined from their assigned reception variables A0, A1, A2 and A3 and stored in the respective reception area data set.
  • the corresponding distance image is shown in Figure 9.
  • Figure 11 shows the section through the distance image.
  • the spherical coordinates α, β and D for the field of view areas 52 recorded with the ring reception areas 34k, which are designated 52k for easier distinction, are converted into Cartesian coordinates.
  • the spherical coordinates can be converted into Cartesian coordinates of a three-dimensional Cartesian x-y-z coordinate system based on the two-dimensional x-y coordinate system for the reception matrix 32 explained above.
  • the respective distance size D is used as the z coordinate of the x-y-z coordinate system.
  • the reception matrix coordinates x and y represent the position of the reception area 34 within the reception matrix 32 and thus also the corresponding reception area field of view 50.
  • the Cartesian coordinates x and y are additionally stored, for example, in the reception area data sets for the ring reception areas 34k.
  • a reception area data set for a reception area 34 can look like this: (α, β, D, x, y, A0, A1, A2, A3); collecting the ring coordinates from such data sets is sketched below.
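A hedged sketch of collecting the ring coordinates (x, y, D) of all ring reception areas 34k into one point set, assuming the distance sizes are held in a two-dimensional array:

```python
import numpy as np

def ring_coordinates(distance_image: np.ndarray,
                     ring: np.ndarray) -> np.ndarray:
    """Collect ring coordinates (x, y, D) of all ring reception areas
    34k as an (N, 3) array.

    distance_image: distance sizes D per reception area, shape (rows, cols)
    ring: boolean ring mask, same shape (e.g. from ring_mask above)
    """
    ys, xs = np.nonzero(ring)          # row (y) and column (x) indices
    d = distance_image[ys, xs]         # distance sizes of the ring areas
    return np.column_stack((xs, ys, d))
```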
  • An approximation surface 44 is constructed in the x-y-z coordinate system from the ring coordinates x, y and D of the ring reception areas 34k.
  • the corresponding corrected distance image is shown in Figure 10, where the z-axis is labeled with the distance size D.
  • Figure 12 shows the section through the corrected distance image.
  • the approximation surface 44 is shown in white in Figure 10 and appears in section as a line in Figure 12.
  • the approximation surface 44 has the shape of a plane.
  • the type of approximation surface 44 is adapted to the shape of highly reflective objects 18 that are expected during regular operation of the vehicle 10. Street signs have flat retroreflective surfaces, so that a good approximation for the object 18 in the form of a street sign in the exemplary scene can be achieved with a flat approximation surface 44, i.e. an approximation plane.
  • A planar approximation surface 44 can also be referred to as a “fit plane” or “best-fit plane”.
  • an approximation method, for example a plane-fit method which is of no further interest here, is used (a minimal least-squares sketch follows below).
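The plane-fit method itself is left open in the text. One common, minimal choice is a linear least-squares fit of D = a*x + b*y + c through the ring coordinates; the sketch below is such an assumption, not the patent's prescribed method.

```python
import numpy as np

def fit_plane(points: np.ndarray) -> tuple[float, float, float]:
    """Fit an approximation plane D = a*x + b*y + c through at least
    three ring coordinates (x, y, D) by linear least squares.

    points: (N, 3) array of ring coordinates, N >= 3.
    Returns the plane coefficients (a, b, c).
    """
    x, y, d = points[:, 0], points[:, 1], points[:, 2]
    design = np.column_stack((x, y, np.ones_like(x)))  # rows [x, y, 1]
    (a, b, c), *_ = np.linalg.lstsq(design, d, rcond=None)
    return float(a), float(b), float(c)
```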
  • Each approximation point 46 on the approximation surface 44 is characterized in the x-y-z coordinate system by two reception matrix coordinates, namely the x coordinate and the y coordinate, and an approximation distance size coordinate (approximation z coordinate) in the form of an approximation distance size D*.
  • the reception matrix coordinates x and y enable the assignment to a reception area 34 and the corresponding reception area field of view 50.
  • the approximation distance size coordinate makes it possible to assign the approximation distance size D* of the approximation point 46 to a corresponding distance size D (z coordinate) in the x-y-z coordinate system.
  • the approximation distance size D* can be viewed as the length of a vector between the reception area 34 and the corresponding approximation point 46.
  • the length of the vector therefore indicates a distance between the reception area 34 and the corresponding approximation point 46.
  • the approximation distance size D* characterizes a virtual distance of the approximation point 46 to the reception area 34.
  • the virtual distance of the approximation point 46 corresponds to the distance of a virtual object target 19 from the oversaturated reception area 34; received light signals 30 coming from this virtual object target 19 and received with the oversaturated reception area 34 would cause a distance size coordinate in the form of a distance size D whose value corresponds to the value of the approximation distance size coordinate, namely the approximation distance size D*, of the approximation point 46.
  • for each oversaturated reception area 34, the corresponding approximation point 46 on the approximation surface 44 is determined which has the same reception matrix coordinates x and y as the oversaturated reception area 34.
  • the approximation distance size D* of this approximation point 46 is used as the distance size for the corresponding oversaturated reception area 34.
  • a reception area data set results in the form (x, y, D*, As, As, As, As).
  • the approximation distance size D* represents the distance from the oversaturated reception area 34 to the object target 19 from which the received light signals 30 that lead to the oversaturation of the reception area 34 originate (a sketch of this substitution step follows below).
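A minimal sketch of the substitution step, assuming the plane coefficients a, b, c from the fit above: for every oversaturated reception area, the fitted plane is evaluated at its matrix coordinates x and y to obtain D*.

```python
import numpy as np

def approximation_distances(saturation_mask: np.ndarray,
                            a: float, b: float, c: float) -> np.ndarray:
    """For every oversaturated reception area, evaluate the fitted
    plane at its matrix coordinates x, y to obtain the approximation
    distance size D* of the approximation point 46.

    Non-oversaturated areas are left as NaN (their measured D stands).
    """
    d_star = np.full(saturation_mask.shape, np.nan)
    ys, xs = np.nonzero(saturation_mask)
    d_star[ys, xs] = a * xs + b * ys + c   # D* = a*x + b*y + c
    return d_star
```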
  • the Cartesian coordinates x, y and z of the approximation point 46, where the z coordinate is given by the approximation distance size D*, are then converted into spherical coordinates α, β and D* of the spherical coordinate system.
  • the approximation distance size D* can be used directly as the radius coordinate of the spherical coordinate system.
  • the spherical coordinates α and β can, for example, also be stored in the reception area data set of the oversaturated reception area 34.
  • the corresponding reception area data set can now be as follows: (α, β, x, y, D*, As, As, As, As).
  • the distances from neighboring object targets 19, which lead to oversaturations in reception areas 34, can be determined separately.
  • an orientation of the surface of the object 18 with the object targets 19 leading to oversaturation can be determined relative to the LiDAR system 12; a sketch of such an orientation estimate follows below.
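As a final hedged illustration, the orientation can be read off the normal (a, b, -1) of the fitted plane D = a*x + b*y + c. Note the assumption that x and y have been scaled to the same metric units as D; with raw pixel indices the angle is only qualitative.

```python
import math

def surface_tilt_deg(a: float, b: float) -> float:
    """Tilt of the fitted object surface relative to the viewing
    direction (z-axis), from the plane D = a*x + b*y + c.

    The plane normal is (a, b, -1); the angle between that normal
    and the z-axis gives the tilt of the surface.
    """
    normal_len = math.sqrt(a * a + b * b + 1.0)
    return math.degrees(math.acos(1.0 / normal_len))
```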

Abstract

The invention relates to a method for operating a LiDAR system in which an electromagnetic scanning beam is transmitted into a monitoring region. A reception beam, originating from the scanning beam reflected at an object target, is received with reception areas of a reception matrix. At least part of the received reception beam is converted into reception variables. The reception variables are each assigned to the reception areas. It is determined whether at least one reception area is oversaturated by the reception beam. For at least some of the non-oversaturated reception areas, at least one respective distance size, which characterizes a distance of the object target from the corresponding reception area, is determined from the respective reception variables. In the case where multiple oversaturated reception areas form a saturation field (38), non-oversaturated reception areas positioned in a ring (40) around the at least one saturation field (38) are identified as ring reception areas (34k). Respective ring coordinates (x, D) are determined for at least three ring reception areas (34k). The ring coordinates (x, D) comprise at least one reception matrix coordinate (x), which represents the position of the ring reception area (34k) within the reception matrix, and at least one distance size coordinate (D), which represents the respectively determined distance size. A best-fit plane (44) is formed from the ring coordinates (x, D). For an oversaturated reception area, an approximation point (46) is determined on the best-fit plane (44) which has the same reception matrix coordinate (x) as the at least one oversaturated reception area and at least one approximation distance size coordinate (D*). Using the approximation distance size coordinate (D*) of the approximation point (46), an approximation distance size (D*) is determined for the oversaturated reception area.
PCT/EP2023/066060 2022-06-20 2023-06-15 Procédé de fonctionnement d'un système lidar, système lidar, et véhicule comprenant au moins un système lidar WO2023247304A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102022115275.8 2022-06-20
DE102022115275.8A DE102022115275A1 (de) 2022-06-20 2022-06-20 Verfahren zum Betreiben eines LiDAR-Systems, LiDAR-System und Fahrzeug mit wenigstens einem LiDAR-System

Publications (1)

Publication Number Publication Date
WO2023247304A1 true WO2023247304A1 (fr) 2023-12-28

Family

ID=87001852

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/066060 WO2023247304A1 (fr) 2022-06-20 2023-06-15 Procédé de fonctionnement d'un système lidar, système lidar, et véhicule comprenant au moins un système lidar

Country Status (2)

Country Link
DE (1) DE102022115275A1 (fr)
WO (1) WO2023247304A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200011972A1 (en) 2018-07-04 2020-01-09 Hitachi-Lg Data Storage, Inc. Distance measurement device
US20200182971A1 (en) * 2018-12-07 2020-06-11 Infineon Technologies Ag Time of Flight Sensor Module, Method, Apparatus and Computer Program for Determining Distance Information based on Time of Flight Sensor Data

Also Published As

Publication number Publication date
DE102022115275A1 (de) 2023-12-21

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23733902

Country of ref document: EP

Kind code of ref document: A1