WO2022258444A1 - Method and device for determining a range of a sensor - Google Patents

Method and device for determining a range of a sensor

Info

Publication number
WO2022258444A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
point cloud
training
feature vector
points
Prior art date
Application number
PCT/EP2022/064843
Other languages
German (de)
English (en)
Inventor
Juan Carlos Garza Fernandez
Sebastien Lemetter
Gabriela Samagaio
Joao Andrade
Farooq Ahmed Zuberi
Carl Mueller-Roemer
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH
Publication of WO2022258444A1

Links

Classifications

    • G01S 7/497 Means for monitoring or calibrating (details of systems according to group G01S 17/00)
    • G01S 17/89 Lidar systems specially adapted for mapping or imaging
    • G01S 17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S 7/4808 Evaluating distance, position or velocity data (details of systems according to group G01S 17/00)
    • G06N 20/00 Machine learning
    • G01S 15/89 Sonar systems specially adapted for mapping or imaging
    • G01S 15/931 Sonar systems specially adapted for anti-collision purposes of land vehicles
    • G01S 17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S 7/40 Means for monitoring or calibrating (details of systems according to group G01S 13/00)
    • G01S 7/53 Means for transforming coordinates or for evaluating data, e.g. using computers (details of systems according to group G01S 15/00)

Definitions

  • The present invention relates to a method and a device for determining a range of a sensor, in particular a lidar sensor.
  • Environment recognition systems are known from the prior art; they are used, for example, in vehicles and are set up to capture and analyze the current surroundings on the basis of one or more sensors.
  • A result of this analysis is used, for example, in information and/or warning systems of such vehicles and/or in systems for autonomous and/or semi-autonomous driving, etc.
  • Lidar and/or radar and/or ultrasonic sensors and/or cameras and/or stereo cameras are used as sensors of such environment recognition systems. It is also known to generate a 3D point cloud from the respective measurement signals of the sensors, which represents reflections of measurement signals (e.g. laser signals) in the area surrounding the sensor.
  • US2016154999 A1 discloses a method for obtaining a 3D point cloud relating to an object of interest, wherein first ground points and/or object points are removed from the 3D point cloud using an unsupervised segmentation method, in order to then identify vertical objects using a supervised segmentation method.
  • US9360554 B2 discloses a method and a device for object recognition based on a lidar array, which, among other things, provides information about the reflection properties of illuminated targets.
  • According to a first aspect of the invention, a method for determining a range of a sensor is proposed.
  • In the method, a current 3D point cloud representing the surroundings of the sensor is used, for example, by an evaluation unit according to the invention, which is set up to execute the method steps according to the invention (e.g. using a computer program).
  • The 3D point cloud is preferably generated by a processing unit of the sensor (e.g. an ASIC, etc.); alternatively or additionally, the 3D point cloud can be generated by the evaluation unit according to the invention and/or a different processing unit.
  • The sensor is not limited to a specific type of sensor or sensor technology, as long as a 3D point cloud representing the surroundings of the sensor can be generated with the required local resolution on the basis of the sensor.
  • The sensor is advantageously a sensor of a vehicle, in particular a road vehicle such as a car or a truck, without being restricted to such an area of use.
  • At least one feature vector is extracted from the 3D point cloud, preferably using the evaluation unit according to the invention, the feature on which each feature vector is based being related to a current range of the sensor. It should be pointed out that the number of elements from which a respective feature vector is composed is not limited to a specific number. If several feature vectors are used, the number of elements may also differ from one feature vector to another.
  • In a further step, a current range of the sensor is determined on the basis of a regression method (e.g. a linear or non-linear regression method) by entering the at least one feature vector as an influencing variable into a regression model and taking the current range as a target variable from the regression model. The regression model has been adapted (trained) beforehand on the basis of a training data set which has a large number of predefined training feature vectors and the respectively associated training ranges, the training feature vectors at least partially representing different ranges; an illustrative sketch follows below.
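  • As a minimal sketch, assuming a support vector regression model (one of the algorithms named further below) and entirely hypothetical feature data, the training and execution of such a regression model could look as follows in Python with scikit-learn; none of the values are taken from the patent:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Hypothetical training data set: each row is a predefined training feature
# vector extracted from a 3D point cloud; each target is the associated
# training range in metres (the vectors at least partially cover different ranges).
X_train = rng.random((200, 6))
y_train = 20.0 + 180.0 * X_train[:, 0] + 5.0 * rng.standard_normal(200)

# Adapt (train) the regression model beforehand on the training data set.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X_train, y_train)

# Execution phase: enter the current feature vector as influencing variable
# and take the current range as target variable from the regression model.
x_current = rng.random((1, 6))
current_range = float(model.predict(x_current)[0])
print(f"estimated current sensor range: {current_range:.1f} m")
```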
  • The information about the current range of the sensor obtained on the basis of the method described above is transmitted, for example, to a downstream environment detection system and/or to an information system (e.g. of a vehicle), so that these systems can react appropriately if the current sensor range changes.
  • A reaction, such as a warning to a user or an automatic intervention in a system comprising the sensor, can take place, for example, if the current range deviates from a maximum range of the sensor by a predefined factor; a sketch of such a check follows below.
  • Such a reduction in range can be caused, for example, by rain and/or fog and/or smoke in the surroundings and/or by dirt on an environment-facing interface (e.g. a protective glass) of the sensor, etc.
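  • A minimal sketch of such a reaction rule; the maximum range and the factor are assumptions for illustration, not values from the patent:

```python
MAX_RANGE_M = 200.0   # hypothetical maximum range of the sensor
WARN_FACTOR = 0.5     # hypothetical predefined factor triggering a reaction

def check_range(current_range_m: float) -> bool:
    # Warn (or trigger an automatic intervention) when the current range has
    # dropped below the predefined fraction of the maximum range, e.g. due to
    # rain, fog, smoke or dirt on the sensor's protective glass.
    degraded = current_range_m < WARN_FACTOR * MAX_RANGE_M
    if degraded:
        print(f"warning: sensor range reduced to {current_range_m:.0f} m")
    return degraded

check_range(80.0)  # example: 80 m < 0.5 * 200 m, so a warning is issued
```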
  • In an advantageous embodiment, the feature vector is a first feature vector and at least one second feature vector is extracted from the 3D point cloud, the second feature vector representing a feature that differs from the feature underlying the first feature vector.
  • In this case, the current range of the sensor is determined by means of the regression method at least on the basis of the first feature vector and the second feature vector as the respective influencing variables, with the regression model having been previously adapted or trained on the basis of a training data set which has at least a large number of predefined first training feature vectors and a large number of predefined second training feature vectors and the respectively associated training ranges, and wherein the first training feature vectors and the second training feature vectors each at least partially represent different ranges.
  • Mutually differing third feature vectors, possibly fourth feature vectors and possibly further feature vectors are advantageously used to determine the current range on the basis of a regression model adapted or trained in this way.
  • Using a larger number of mutually differing feature vectors may contribute to a more reliable determination of the current range, since different features of the 3D point cloud can be suited to different degrees for determining the current range, depending on the environmental conditions present at the time.
  • The regression method is preferably based on a machine learning method, which in particular uses a support vector machine algorithm and/or a random forest algorithm and/or a deep learning algorithm and/or a different machine learning algorithm.
  • The current range of the sensor is determined for the entire field of view of the sensor or for different partial areas of the field of view of the sensor.
  • Each of these sub-areas represents a specific volume of the 3D point cloud or a surface onto which the 3D point cloud was projected (e.g. a horizontal surface onto which the points of the 3D point cloud are projected in the vertical direction), wherein the sub-areas can be disjointly adjoining sub-areas of the 3D point cloud or overlapping sub-areas of the 3D point cloud; a sketch of such a partition follows below.
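  • A brief sketch of how a field of view could be split into overlapping azimuth sectors as partial fields of view; the point cloud and the sector boundaries are assumptions made for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
points = rng.uniform(-100.0, 100.0, size=(5000, 3))  # hypothetical 3D point cloud [m]

# Azimuth of every point as seen from the sensor position (assumed at the origin).
azimuth = np.degrees(np.arctan2(points[:, 1], points[:, 0]))

# Overlapping azimuth sectors as partial fields of view (boundaries assumed).
sectors = [(-60, -20), (-30, 10), (0, 40), (20, 60)]
sub_clouds = [points[(azimuth >= lo) & (azimuth < hi)] for lo, hi in sectors]
# A current range would then be determined separately for each sub-cloud.
```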
  • The subdivision into such sub-areas can take place as a function of boundary conditions influencing the range; different road types, elevation profiles, weather conditions, traffic conditions, etc. come into consideration as such boundary conditions.
  • A particularly high degree of flexibility and/or reliability in determining the current range can be achieved by the adaptation options described above.
  • The regression model is additionally adapted or trained taking into account a confidence value for respective range intervals. On the basis of this regression model it is then possible to determine, in addition to the current range, a current confidence value that corresponds to this range and represents a measure of the reliability of the determined range value.
  • The confidence value is preferably taken into account in such a way that a regression model that is initially adapted or trained by means of training feature vectors is adapted or trained in a subsequent step with regard to corresponding confidence values.
  • For this purpose, a number of boundary conditions influencing the range are first established in the area surrounding the sensor, for which current range determinations are carried out using the regression model that has already been adapted or trained.
  • The reliability or accuracy of the current range determined in each case can then be checked manually and linked to the respective corresponding confidence values, which can subsequently be used in the second training step.
  • The confidence values are established, for example, for predefined range intervals of 5 m, 10 m or intervals deviating therefrom; a sketch of such a two-stage adaptation follows below.
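  • A hedged sketch of the second stage: a separate regressor maps feature vectors plus a 10 m range-interval index to manually assigned confidence labels. The data, the interval width and the choice of a random forest are assumptions, not prescribed by the text above:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Stage 1 is assumed already trained; here we only fit the confidence model.
X = rng.random((300, 6))                   # training feature vectors
y_range = 200.0 * X[:, 0]                  # associated training ranges [m]
interval = (y_range // 10).astype(int)     # 10 m range-interval index
y_conf = rng.uniform(0.5, 1.0, size=300)   # manually assigned confidence labels

conf_model = RandomForestRegressor(n_estimators=100, random_state=0)
conf_model.fit(np.column_stack([X, interval]), y_conf)

# Execution: combine the current feature vector with the interval of the
# range estimated by stage 1 and predict the current confidence value.
x_now, r_now = rng.random(6), 87.0
c_now = conf_model.predict(np.append(x_now, int(r_now // 10)).reshape(1, -1))[0]
print(f"range = {r_now:.0f} m, confidence = {c_now:.2f}")
```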
  • Each feature vector is formed, for example, on the basis of one of the features of the 3D point cloud described below; features differing from those described here can be used for the method according to the invention as an alternative or in addition.
  • Features which can essentially be extracted directly from the 3D point cloud, or from the sub-area of the 3D point cloud currently under consideration, are, for example: a number of pixels of a detector surface of the sensor which currently do not generate any points in the 3D point cloud; and/or a number and/or a predefined extent of pixel groups (called "clusters") which currently do not generate any points of the 3D point cloud; and/or a number of pixels which generate at least one point of the 3D point cloud while their neighboring pixels do not generate any points of the 3D point cloud (i.e. a number of isolated pixels, which usually represent a noise component); and/or a number of points in the 3D point cloud over distance. A sketch of such pixel-based counts follows below.
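  • A sketch of such pixel-based counts on a hypothetical detector occupancy grid; the grid shape and occupancy pattern are assumptions made for illustration:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
# True where a detector pixel produced at least one point of the 3D point
# cloud in the current frame (hypothetical 64 x 512 detector surface).
occupied = rng.random((64, 512)) > 0.3

# Number of pixels that currently generate no points.
n_empty = np.size(occupied) - np.count_nonzero(occupied)

# Connected groups ("clusters") of empty pixels and their extents.
labels, n_clusters = ndimage.label(~occupied)
cluster_sizes = np.bincount(labels.ravel())[1:]

# Isolated pixels: occupied while all 8 neighbours are empty (noise indicator).
neighbours = ndimage.convolve(occupied.astype(int),
                              np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]]),
                              mode="constant")
n_isolated = int(np.count_nonzero(occupied & (neighbours == 0)))

feature_vector = np.array([n_empty, n_clusters, cluster_sizes.max(), n_isolated])
```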
  • The number of points in the 3D point cloud over distance is determined, for example, in such a way that the field of view of the sensor, or the partial area of the field of view currently under consideration, is divided into a plurality of segments which subdivide the field of view, or the part of the field of view, starting from the sensor position in the detection direction of the sensor.
  • Such a subdivision can preferably be a uniform subdivision, so that, for example, starting from the sensor position, a new segment is started every 10 m or 20 m.
  • The segmentation takes place, for example, on the basis of concentric spherical segments or circular segments, which subdivide the field of view or the part of the field of view to be considered, without being restricted to this type of subdivision.
  • The distances between respective segments (e.g. the distances between spherical segments or circular segments) can also be chosen to differ from one another.
  • After the segmentation, the number of points of the 3D point cloud described above can be calculated, for example, in such a way that the number of points of the 3D point cloud present in each segment is determined and the respective result values are combined to form a feature vector, in a sequence corresponding to the respective distances of their associated segments from the position of the sensor; a sketch follows below.
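  • A sketch of the per-segment point counts, assuming concentric segments every 10 m starting from the sensor position at the origin (the point cloud is synthetic):

```python
import numpy as np

rng = np.random.default_rng(3)
points = rng.uniform(-100.0, 100.0, size=(5000, 3))  # hypothetical 3D point cloud [m]

# Radial distance of each point from the (assumed) sensor position at the origin.
dist = np.linalg.norm(points, axis=1)

# Concentric segments every 10 m, starting from the sensor position (cf. above).
edges = np.arange(0.0, 190.0, 10.0)
counts_per_segment, _ = np.histogram(dist, bins=edges)

# Feature vector: per-segment point counts ordered by distance from the sensor.
feature_vector = counts_per_segment.astype(float)
```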
  • The features described below relate to those points of the 3D point cloud that were previously classified either as ground points or as object points. Such a classification takes place, for example, using preprocessing methods known from the prior art for classifying or segmenting the 3D point cloud into ground points and object points.
  • Ground point-related features are, for example, a density of ground points in the 3D point cloud over distance and/or a rate of change of the number of ground points in the 3D point cloud over distance (cf. the exemplary embodiment below).
  • Object point-related features of the 3D point cloud include, for example, a distance of that grouping of object points in the 3D point cloud which represents the greatest (e.g. average) distance from the sensor position, and/or an average light intensity of that grouping of object points in the 3D point cloud which has the greatest distance from the sensor position, and/or a number of points of that grouping of object points in the 3D point cloud which represents the greatest distance from the sensor position. A sketch of such ground point- and object point-related features follows below.
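  • A sketch of such ground point- and object point-related features; the synthetic points, the intensity column and the use of DBSCAN for grouping are assumptions made for illustration, not taken from the patent:

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(4)

# Ground-point density over distance: classified ground points per 10 m ring,
# normalised by ring area (one possible reading of "density over distance").
ground = rng.uniform(-80.0, 80.0, size=(3000, 2))       # ground points (x, y) [m]
d_ground = np.linalg.norm(ground, axis=1)
edges = np.arange(0.0, 90.0, 10.0)
ring_area = np.pi * (edges[1:] ** 2 - edges[:-1] ** 2)
ground_density = np.histogram(d_ground, bins=edges)[0] / ring_area

# Hypothetical object points: a few synthetic groupings with an intensity value.
centers = rng.uniform(10.0, 100.0, size=(5, 3))
xyz = np.vstack([c + rng.normal(0.0, 0.5, size=(50, 3)) for c in centers])
objects = np.hstack([xyz, rng.uniform(0.0, 1.0, size=(len(xyz), 1))])

# Group the object points and pick the grouping with the greatest mean
# distance from the sensor position (assumed at the origin).
labels = DBSCAN(eps=2.0, min_samples=5).fit_predict(objects[:, :3])
far_label = max(set(labels[labels >= 0]),
                key=lambda l: np.linalg.norm(objects[labels == l, :3], axis=1).mean())
far_group = objects[labels == far_label]

farthest_distance = np.linalg.norm(far_group[:, :3], axis=1).mean()
mean_intensity = far_group[:, 3].mean()   # average light intensity of that grouping
n_points_far = len(far_group)             # number of points in that grouping
```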
  • The sensor is advantageously a lidar sensor and/or a radar sensor and/or a different sensor.
  • The method according to the invention is carried out particularly advantageously on the basis of one or more lidar sensors of a vehicle.
  • According to a second aspect of the invention, a device for determining a range of a sensor is proposed, having a sensor and an evaluation unit.
  • The evaluation unit is designed, for example, as an ASIC, FPGA, processor, digital signal processor, microcontroller or the like.
  • The evaluation unit is preferably communicatively connected to a memory unit in which, for example, a computer program is stored that is set up to execute the method according to the invention described above.
  • The memory unit can be used to store and provide data received and/or calculated by the evaluation unit.
  • The evaluation unit is set up to use a current 3D point cloud, detected by the sensor, that represents the surroundings of the sensor, and to extract at least one feature vector from the 3D point cloud, the feature on which the feature vector is based being related to a current range of the sensor.
  • Furthermore, the evaluation unit is set up to determine a current range of the sensor on the basis of a regression method by entering the at least one feature vector as an influencing variable into a regression model and taking the current range as a target variable from the regression model, the regression model having been adapted on the basis of a training data set which has a large number of predefined training feature vectors and the respectively associated training ranges, the training feature vectors at least partially representing different ranges.
  • The evaluation unit is generally set up to carry out the method according to the invention described above.
  • The features, feature combinations and the advantages resulting from them correspond to those stated in connection with the first-mentioned aspect of the invention, so that, to avoid repetition, reference is made to the above statements.
  • FIG. 1 shows a flowchart representing an exemplary embodiment of a method according to the invention.
  • FIG. 2 shows a schematic overview of a field of view and respective partial fields of view of a sensor used for the method according to the invention.
  • FIG. 3 shows a schematic overview of a vehicle with a device according to the invention.
  • FIG. 1 shows a flowchart representing an exemplary embodiment of a method according to the invention.
  • In step 100 of the method according to the invention, a 3D point cloud generated by a lidar sensor 10 of a vehicle 90 is received by an evaluation unit 70 of the vehicle 90 and used therein.
  • In step 150 of the method according to the invention, the 3D point cloud, which represents an entire field of view 40 of the lidar sensor 10, is divided into a number of partial fields of view 45, with the subdivision taking place depending on current boundary conditions, which here comprise current weather conditions.
  • In step 200 of the method according to the invention, general feature vectors of the 3D point cloud as well as ground point-related and object point-related feature vectors of the 3D point cloud are extracted from the points of the 3D point cloud corresponding to the respective partial fields of view 45, the feature corresponding to each feature vector being related in each case to a range of the lidar sensor 10.
  • Information about a classification of respective points of the 3D point cloud as ground points or as object points is made available to the evaluation unit 70 in the course of the provision of the 3D point cloud by the lidar sensor 10.
  • A density 63 of ground points of the 3D point cloud over distance and a rate of change 64 of the number of ground points of the 3D point cloud over distance are extracted as ground point-related feature vectors.
  • A distance 65 of that grouping of object points in the 3D point cloud which represents the greatest distance from the lidar sensor position, and an average light intensity 66 of that grouping of object points in the 3D point cloud which has the greatest distance from the lidar sensor position, are extracted here as object point-related feature vectors.
  • In step 300, a current range 5 of the lidar sensor 10 and a current confidence value 50 corresponding to it are determined on the basis of a support vector machine algorithm, which represents a regression model 20, in an execution phase 85 (or test phase) of the support vector machine algorithm, by entering the respective extracted feature vectors as influencing variables into the support vector machine algorithm and taking the current range 5 and the current confidence value 50 for the current range 5 as target variables from the support vector machine algorithm.
  • Beforehand, the support vector machine algorithm was trained on the basis of a training data set 30, which has a large number of predefined training feature vectors and associated training ranges and training confidence values for each extracted feature, the training feature vectors each at least partially representing different ranges.
  • A number and a type of feature vectors used in each case for determining the current range 5 and the current confidence value 50 for the current range 5 are defined as a function of current traffic conditions.
  • FIG. 2 shows a schematic overview of a field of view 40 and respective partial fields of view 45 of a sensor 10 used for the method according to the invention. Using the method according to the invention, individual current ranges 5 are determined for each partial field of view 45 and then transmitted to an environment recognition system 95.
  • FIG. 3 shows a schematic overview of a vehicle 90, which is a passenger car here, with a device according to the invention.
  • The device includes an evaluation unit 70, which is designed here as an ASIC and which is communicatively connected to a storage unit 75.
  • Evaluation unit 70 is communicatively connected to a lidar sensor 10 of vehicle 90, which has a field of view 40 and a maximum range 30, so that evaluation unit 70 is set up on the basis of this configuration to carry out the method according to the invention described above in order to determine the current range 5 and a current confidence value 50 for the current range 5.
  • The evaluation unit 70 transmits the respective results for the current range 5 and the current confidence value 50 to an environment detection system 95 of the vehicle 90, which takes these values into account in the analysis of the environment of the vehicle 90 together with measured values from the lidar sensor 10.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Software Systems (AREA)
  • Electromagnetism (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Medical Informatics (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The present invention relates to a method and a device for determining the range of a sensor. The method comprises the following steps: using (100) a current 3D point cloud that is representative of the surroundings of a sensor (10); extracting (200) at least one feature vector from the 3D point cloud, the feature underlying the respective feature vector being related to a current range (5) of the sensor (10); and determining (300) a current range (5) of the sensor (10) on the basis of a regression method, the at least one feature vector being entered as an influencing variable into a regression model (20) and the current range (5) being taken as a target variable from the regression model (20), the regression model (20) having been previously adapted (trained) on the basis of a training data set that comprises a plurality of predefined training feature vectors and the respective associated training ranges, the training feature vectors at least partially representing different ranges.
PCT/EP2022/064843 2021-06-08 2022-06-01 Method and device for determining a range of a sensor WO2022258444A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102021205750.0 2021-06-08
DE102021205750.0A DE102021205750A1 (de) 2021-06-08 2021-06-08 Verfahren und Vorrichtung zum Ermitteln einer Reichweite eines Sensors

Publications (1)

Publication Number Publication Date
WO2022258444A1 (fr)

Family

ID=82258480

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/064843 WO2022258444A1 (fr) 2021-06-08 2022-06-01 Method and device for determining a range of a sensor

Country Status (2)

Country Link
DE (1) DE102021205750A1 (fr)
WO (1) WO2022258444A1 (fr)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9360554B2 (en) 2014-04-11 2016-06-07 Facet Technology Corp. Methods and apparatus for object detection and identification in a multiple detector lidar array
US20160154999A1 (en) 2014-12-02 2016-06-02 Nokia Technologies Oy Objection recognition in a 3d scene
DE102018209595A1 (de) * 2018-06-14 2019-12-19 Robert Bosch Gmbh Verfahren zum automatischen Bestimmen eines Straßenzustands
DE102018008442A1 (de) * 2018-10-26 2019-03-28 Daimler Ag Verfahren zur Wetter- und/oder Sichtweitenerkennung
CN112417757A (zh) * 2020-11-18 2021-02-26 中国汽车技术研究中心有限公司 车载雷达信号级仿真方法、装置、设备及可读存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Brownlee, Jason: "How to Choose a Feature Selection Method For Machine Learning", Machine Learning Mastery, 27 November 2019, XP055934692, Retrieved from the Internet <URL:https://machinelearningmastery.com/feature-selection-with-real-and-categorical-data> [retrieved on 2022-06-23] *

Also Published As

Publication number Publication date
DE102021205750A1 (de) 2022-12-08


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22734192

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22734192

Country of ref document: EP

Kind code of ref document: A1