WO2023131603A1 - Method for optimizing environment perception for a driving support system by means of an additional reference sensor system - Google Patents
Method for optimizing environment perception for a driving support system by means of an additional reference sensor system
- Publication number
- WO2023131603A1 (PCT/EP2023/050053)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor
- sensor data
- environment model
- fusion
- sensors
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 77
- 230000004927 fusion Effects 0.000 claims abstract description 55
- 238000004590 computer program Methods 0.000 claims description 14
- 238000011156 evaluation Methods 0.000 claims description 10
- 230000008447 perception Effects 0.000 description 15
- 238000001514 detection method Methods 0.000 description 13
- 230000006870 function Effects 0.000 description 9
- 239000010410 layer Substances 0.000 description 9
- 230000008901 benefit Effects 0.000 description 7
- 230000008569 process Effects 0.000 description 6
- 238000005457 optimization Methods 0.000 description 5
- 230000006872 improvement Effects 0.000 description 4
- 238000002955 isolation Methods 0.000 description 4
- 238000013459 approach Methods 0.000 description 3
- 238000012545 processing Methods 0.000 description 3
- 230000007613 environmental effect Effects 0.000 description 2
- 239000002346 layers by function Substances 0.000 description 2
- 230000003044 adaptive effect Effects 0.000 description 1
- 238000013473 artificial intelligence Methods 0.000 description 1
- 238000012937 correction Methods 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000003384 imaging method Methods 0.000 description 1
- 230000003993 interaction Effects 0.000 description 1
- 238000005259 measurement Methods 0.000 description 1
- 230000003068 static effect Effects 0.000 description 1
- 230000009897 systematic effect Effects 0.000 description 1
Classifications
- G06V20/56 — Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G01S13/865 — Combination of radar systems with lidar systems
- G01S13/867 — Combination of radar systems with cameras
- G01S13/876 — Combination of several spaced transponders or reflectors of known location for determining the position of a receiver
- G01S13/931 — Radar systems specially adapted for anti-collision purposes of land vehicles
- G01S17/86 — Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
- G01S17/89 — Lidar systems specially adapted for mapping or imaging
- G01S17/931 — Lidar systems specially adapted for anti-collision purposes of land vehicles
- G01S7/4026 — Means for monitoring or calibrating parts of a radar system; antenna boresight
- G01S7/4091 — Means for monitoring or calibrating by simulation of echoes using externally generated reference signals, during normal radar operation
- G01S7/497 — Means for monitoring or calibrating (lidar systems)
- G06F18/25 — Pattern recognition; analysing; fusion techniques
Definitions
- The present invention relates to a method for optimizing the perception of the surroundings for a driving support system with sensor data fusion using additional reference sensors.
- The present invention also relates to a control unit that is set up to carry out such a method.
- The present invention further relates to a computer program comprising instructions which, when the computer program is executed by a computer or a control unit, cause the latter to carry out the method, and to a machine-readable storage medium on which this computer program is stored.
- The software architecture of modern multi-sensor driver assistance systems typically consists of several layers that build on one another. These layers usually comprise the sensors, the perception of the surroundings, the situation analysis, the function, and the control.
- The central requirement for validating and optimizing performance, i.e. the behavior of the software as experienced in real traffic situations, is the availability of a meaningful amount of driving data. These data are collected in endurance runs of at least several thousand kilometers, and the behavior of the vehicle is then analyzed in these measurement campaigns. The data annotated in this way form the basis for subsequent optimization runs. Validation and optimization of performance mostly take place only at the level of the overall system; the performance of the individual layers in isolation is usually not considered.
- DE 10 2018 123 735 A1 describes a method for improving the evaluation of an object detection of a radar device of a motor vehicle.
- A method for optimizing the environment perception for a driving support system with sensor data fusion is described, in particular by means of additional reference sensors, the method having the following method steps: a) using a sensor for environment perception as a reference sensor; b) creating a first environment model based on sensor data from the reference sensor; c) creating a second environment model based on sensor data from the sensor data fusion; and d) comparing the first environment model with the second environment model. A minimal code sketch of these steps follows below.
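- Purely for illustration, the following Python sketch shows one possible shape of steps a) to d): a first (reference) model and a second (fusion) model are built elsewhere and then compared, here at the very coarse level of per-class object counts. All names and data layouts (DetectedObject, EnvironmentModel, compare_models) are hypothetical and not prescribed by the application.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class DetectedObject:
    obj_class: str  # e.g. "vehicle", "pedestrian"
    x: float        # position in the ego frame, metres
    y: float


@dataclass
class EnvironmentModel:
    source: str                    # e.g. "lidar_reference" or "fusion"
    objects: list[DetectedObject]


def compare_models(first: EnvironmentModel, second: EnvironmentModel) -> dict:
    """Method step d): compare two environment models, here simply as the
    per-class difference in object counts (reference minus fusion)."""
    ref = Counter(o.obj_class for o in first.objects)
    fus = Counter(o.obj_class for o in second.objects)
    return {cls: ref[cls] - fus[cls] for cls in set(ref) | set(fus)}


# Steps a)-c): the models are assumed to have been built from the reference
# sensor and from the sensor data fusion elsewhere in the perception stack.
first_model = EnvironmentModel("lidar_reference",
                               [DetectedObject("vehicle", 20.0, 0.5),
                                DetectedObject("pedestrian", 8.0, -2.0)])
second_model = EnvironmentModel("fusion",
                                [DetectedObject("vehicle", 19.7, 0.4)])
print(compare_models(first_model, second_model))  # the pedestrian is missed
```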
- With a method of this type, the perception of the surroundings for a driving assistance system, and thus the processes of the driving assistance system that build on this perception, can be improved compared to prior-art solutions, in particular by using an additional reference sensor system.
- A fusion system or sensor data fusion means in particular that sensor data from different sensors, in particular from different sensor groups, are combined with one another in order to generate a common environment model.
- Sensor groups should be understood to mean sensors of different types or sensors based on different operating principles.
- A driver assistance system of SAE Level 1 or 2 usually has the following layers or levels: the sensors, the perception of the surroundings, the situation analysis, the function, and the control. These layers expediently build on one another. Against this background, the method described can be particularly advantageous, because checking the performance only at the system level leaves potential for improvement untapped. Since the performance of a deeper layer is not checked in isolation, there is a risk that specific improvement potential for this deeper layer is not identified. The performance of the overall system then remains at a local maximum, instead of reaching a global maximum through consistent performance optimization of the lower layers as well, as is possible according to the invention.
- Sensor sets in current driver assistance systems are not limited to a single sensor, such as a radar sensor, but rather merge the data from at least one video camera and one or more radar sensors in order to calculate a comprehensive environment model.
- Fusion systems are significantly more complex than systems limited to a single sensor: each of the sensors involved has specific systematic strengths and weaknesses and a performance that must be individually evaluated and optimized, and these are combined in non-trivial ways in the fusion, as the sketch below illustrates.
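- As a minimal, textbook-style illustration of such a non-trivial combination (not the specific fusion claimed in this application), the following sketch fuses two range readings of the same object by inverse-variance weighting; the sensor labels, values, and variances are invented.

```python
def fuse_scalar(values, variances):
    """Inverse-variance weighted fusion of scalar measurements of the
    same quantity; more certain sensors contribute more to the result."""
    weights = [1.0 / v for v in variances]
    fused = sum(w * x for w, x in zip(weights, values)) / sum(weights)
    return fused, 1.0 / sum(weights)  # fused value and its variance


# Radar range (precise) combined with camera range (less precise):
dist, var = fuse_scalar([20.3, 21.1], [0.1, 1.0])
print(f"fused distance: {dist:.2f} m, variance: {var:.3f}")  # ~20.37 m
```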
- The present method takes into account that the performance at the level of environment perception in fusion systems is influenced by several overlapping factors, in particular the individual performance of each sensor involved and the behavior of the fusion that combines them.
- The method includes the following method steps.
- In method step a), a sensor for perceiving the surroundings is used as a reference sensor.
- A sensor is thus used in a manner known per se in order to carry out a perception of the surroundings.
- This sensor is used as a reference sensor.
- In method step b), a first environment model, i.e. in particular a ground-truth environment model, is created based on sensor data from the reference sensor.
- This step is basically known from the prior art and is used to detect the surroundings of a vehicle that is equipped with the driving support system and to examine possible objects. Accordingly, the perception of the surroundings can serve as a basis for further steps or processes of the driving support system.
- In method step c), a second environment model is created based on sensor data from the sensor data fusion.
- Here, an environment model is created using different sensors or sensor groups, the sensors or sensor groups for creating the second environment model expediently being different from the sensor or sensor group used for creating the first environment model.
- After the first environment model and the second environment model have been created, they are compared with one another in accordance with method step d).
- The term second environment model is to be understood broadly and can, for example, comprise the overall model as well as parts of it, i.e. the data of individual sensors entering the environment model.
- It is preferable for data from the sensor data fusion to enter method step d) in at least one step.
- The focus of the method described here is therefore on the sub-area of perception of the surroundings using different sensors or sensor groups, and in particular on the sensor data fusion.
- In this way, an improvement in the performance of the environment-perception sub-area can be achieved.
- Becoming stuck in local maxima can be avoided, and it is possible to achieve both significantly better performance of the environment-perception sub-area and improved overall system performance.
- The first environment model can be used as a reference, and within the fusion model the sensor data of individual sensors or sensor groups, i.e. sensors of the same type or based on the same operating principle, can be considered and analyzed in a comparison.
- Individual components of the sensor data fusion and/or the sensor data fusion as a whole can then be related to the first environment model and compared with one another. In this way, an evaluation of individual sensor data can be made possible effectively and reliably, and the result of the creation of an environment model can be improved.
- The method can preferably have the further method step: e) evaluating the sensor data from the sensor data fusion for accuracy.
- The environment model which is created from the sensor fusion, and thus from the totality of the sensor data, can thereby be evaluated. This allows the interaction of the sensors to be evaluated, the data directly entering further processes of a driving support system to be checked, and the data to be corrected if necessary.
- Preferably, the method has the further method step: f) evaluating sensor data from individual sensors or sensor groups as part of the sensor data fusion for accuracy.
- Furthermore, the method can have the further method step: g) evaluating the influence of sensor data from individual sensors or sensor groups as part of the sensor data fusion on the result of the sensor data fusion.
- In this way, the characteristic influence of an individual sensor or a sensor group on the entire fusion component can be determined and optimized, analogously to method step f) and in particular in combination with it; a sketch of such a leave-one-out analysis follows below.
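- One conceivable way to realize such an influence analysis is a leave-one-out comparison, as in the following sketch; the fusion and scoring functions are injected, and the toy example (scalar "models", mean as "fusion") is purely illustrative.

```python
def sensor_influence(reference, sensors, fuse, score):
    """Method step g) as a sketch: re-run the fusion without each sensor in
    turn and measure how much the score against the reference degrades."""
    baseline = score(fuse(list(sensors.values())), reference)
    return {name: baseline - score(fuse([d for n, d in sensors.items()
                                         if n != name]), reference)
            for name in sensors}


# Toy example: the "model" is a single distance estimate, the "fusion" is
# the mean, and the score is the negative absolute error to the reference.
sensors = {"camera": 19.0, "radar_front": 20.4, "radar_corner": 21.8}
fuse = lambda xs: sum(xs) / len(xs)
score = lambda model, ref: -abs(model - ref)
print(sensor_influence(20.0, sensors, fuse, score))
# positive value: the sensor helps the fusion; negative value: it hurts
```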
- At least one of the method steps d) to g) can include the comparison of individually recognized objects in the environment models. In addition to evaluating an entire environment model, examining individual detected objects can deliver particularly precise results, for example as sketched below.
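- A possible object-level comparison is a greedy nearest-neighbour association between the two models with a distance gate, as in the following sketch; the gate value and the representation of objects as (x, y) positions are assumptions.

```python
import math


def match_objects(ref_objs, fused_objs, gate=1.5):
    """Associate individually recognised objects between two environment
    models; returns matched index pairs with their position error, plus
    the missed (reference-only) and spurious (fusion-only) objects."""
    candidates = sorted((math.dist(r, f), i, j)
                        for i, r in enumerate(ref_objs)
                        for j, f in enumerate(fused_objs))
    used_r, used_f, pairs = set(), set(), []
    for d, i, j in candidates:
        if d <= gate and i not in used_r and j not in used_f:
            pairs.append((i, j, round(d, 3)))
            used_r.add(i)
            used_f.add(j)
    missed = [i for i in range(len(ref_objs)) if i not in used_r]
    spurious = [j for j in range(len(fused_objs)) if j not in used_f]
    return pairs, missed, spurious


ref = [(20.0, 0.5), (8.0, -2.0)]    # ground-truth positions
fused = [(19.6, 0.4), (35.0, 3.0)]  # fusion output
print(match_objects(ref, fused))    # ([(0, 0, 0.412)], [1], [1])
```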
- At least one of method steps d) to g) can be carried out with reference to a specific driving situation.
- The driving situation can be recognized by appropriate sensors. This means that it is recognized, for example, whether the vehicle is in a parking process, is driving comparatively fast, is driving comparatively slowly, and so on.
- Environment detection can deliver results of varying accuracy depending on the driving situation, and the sensor data, in particular from different sensors or sensor types, can differ depending on the driving situation. A particularly high-quality result is thus also made possible in this refinement, and driving support processes can be carried out particularly reliably; a sketch of a situation-resolved evaluation follows below.
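- A situation-resolved evaluation could, for example, simply aggregate a per-frame comparison score separately for each recognized driving situation, as in the following sketch; the situation labels and score values are invented.

```python
from collections import defaultdict


def score_by_situation(frames):
    """Average a per-frame model-comparison score per driving situation,
    so that situation-specific weaknesses become visible."""
    buckets = defaultdict(list)
    for situation, score in frames:
        buckets[situation].append(score)
    return {s: sum(v) / len(v) for s, v in buckets.items()}


frames = [("parking", 0.92), ("parking", 0.88),
          ("highway", 0.97), ("urban", 0.81)]
print(score_by_situation(frames))  # e.g. urban frames score worst here
```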
- Data determined in at least one of the method steps d) to g) can preferably be used to improve the recognition of the surroundings and can be stored in a memory, for example.
- In this refinement, the data that are determined are thus fed back to improve the recognition of the surroundings.
- This refinement can be based on an artificial-intelligence algorithm in order to enable a particularly high quality of the result and thereby make driving support steps particularly reliable, precise, and therefore safe.
- Preferably, a lidar sensor is used as the reference sensor in method step a).
- The use of a sensor based on lidar (light imaging, detection and ranging) technology can offer advantages in the context of the method described here.
- The lidar sensor offers the advantage of delivering an almost complete environment model with comparatively high accuracy.
- As a result, the optimization of the environment detection can also be particularly accurate and reliable, so that the result of the method described here is of particularly high quality.
- A control unit is also described, the control unit being set up to carry out a method as described above.
- The control unit can be, for example, a vehicle-side control unit, a vehicle-external control unit, or a vehicle-external server unit, such as a cloud system.
- The control unit expediently includes a processor into which a computer program can be loaded, the program comprising instructions for executing the described method.
- Also described is a computer program which comprises instructions which, when the computer program is executed by a computer or a control unit, cause it to carry out a method as described in detail above.
- The computer program can thus be loaded into the processor in order to execute the instructions for carrying out the method.
- The control unit, the computer program, and the storage medium thus all serve to improve the recognition of the surroundings for a driver assistance system for a vehicle.
- FIG. 1 shows a schematic representation of the different functional layers of a driving support system.
- FIG. 2 shows a schematic representation of an embodiment of the method according to the present invention.
- A typical software architecture of modern multi-sensor driver assistance systems of SAE Levels 1 and 2 is shown in FIG. 1. It typically comprises several layers or levels that build on one another.
- The levels perform the following tasks, each next-higher level depending on the output of the level below it (see the sketch after this list):
- The sensor layer 10 communicates directly with the respective sensor hardware, such as radar, camera, or lidar, and receives the sensor-specific input data.
- The environment perception 12 has the task of creating an environment model on the basis of input data from multiple sensors, potentially of different modalities.
- This environment model provides the basis for executing certain functions (see below) at the overall system level.
- Such an environment model includes, among other things, static and dynamic objects such as road users, road signs, and road markings.
- The situation analysis 14 derives an understanding of the specific current traffic situation of the ego vehicle from the current environment model and its time history.
- The function layer 16 implements the actual driver assistance functions that can be experienced in the vehicle, such as automatic emergency braking or adaptive cruise control.
- The control layer 18 converts the control commands of the function layer into driving commands for the actuators.
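- Purely to make this layer structure concrete, the following sketch passes data through the five layers in order; every function here is a trivial placeholder and not part of the application.

```python
# Placeholder implementations so the sketch runs; real layers are far richer.
read_sensors = lambda frames: {"radar": frames, "camera": frames}   # layer 10
perceive = lambda data: {"objects": ["vehicle_ahead"]}              # layer 12
analyse = lambda env: "follow_vehicle"                              # layer 14
decide = lambda situation: {"brake": 0.0, "throttle": 0.1}          # layer 16
actuate = print                                                     # layer 18


def run_stack(raw_frames):
    """Each layer consumes only the output of the layer below it."""
    actuate(decide(analyse(perceive(read_sensors(raw_frames)))))


run_stack(["frame_0", "frame_1"])  # prints the resulting driving command
```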
- FIG. 2 shows a vehicle 20 which has a control unit 22 that can be part of a driving support system and with which a method for perceiving the surroundings for a driving support system with sensor data fusion can be implemented.
- The vehicle 20 can be operated in an assisted, partially automated, highly automated and/or fully automated or driverless manner.
- The vehicle or the control unit can have the software architecture shown in FIG. 1.
- The vehicle 20 may be, for example, a passenger car, a truck, a robotaxi, and the like.
- The vehicle 20 is, however, not generally limited to on-road operation; rather, the vehicle 20 can also be designed as a watercraft or an aircraft, such as a transport drone, and the like.
- The vehicle 20 has a plurality of sensors which acquire sensor data and thus provide the data for a sensor data fusion.
- In detail, a lidar sensor 24, a video camera 26, and a plurality of radar sensors 28 are shown.
- The sensors can be used to identify the surroundings by determining corresponding sensor data, by means of which objects 30 in the surroundings of the vehicle 20 can be recognized and the surroundings can thus be represented.
- The sensor data can, for example, differ from one another depending on the sensor.
- Virtual objects 32 are based on sensor data from the lidar sensor 24.
- Virtual objects 34 are based on sensor data from the video camera 26.
- Virtual objects 36 are based on sensor data from the radar sensors 28.
- Virtual objects 38 are based on the sensor data fusion, i.e. on the combined sensor data from the video camera 26 and the radar sensors 28.
- Such a setup allows a method to be carried out with the method steps: a) using a sensor for perceiving the surroundings as a reference sensor; b) creating a first environment model based on sensor data from the reference sensor; c) creating a second environment model based on sensor data from the sensor data fusion; and d) comparing the first environment model with the second environment model.
- Although part of the larger sensor set, the lidar sensor 24 is not used for the actual signal processing for automated driving or driver assistance functions. Instead, it serves as a data supplier for a ground-truth environment model, which is recorded in all endurance runs in addition to the primary sensor data. This environment model is compared with the output of the actual signal processing, in particular of the sensor data fusion.
- The ground truth can be compared both with the output of the fusion component and with the output of the individual sensors.
- The performance of the sensors can thus be evaluated individually, as can that of the fusion component.
- The data from the sensors and from the fusion component can be related to one another in order to concretely identify the specific strengths and weaknesses of the sensors and of the fusion component.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202380026290.4A CN118871955A (zh) | 2022-01-10 | 2023-01-03 | Method for optimizing environment perception for a driving support system by means of an additional reference sensor system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102022200139.7A DE102022200139A1 (de) | 2022-01-10 | 2022-01-10 | Method for optimizing environment perception for a driving support system by means of an additional reference sensor system |
DE102022200139.7 | 2022-01-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023131603A1 (de) | 2023-07-13 |
Family
ID=84923188
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2023/050053 WO2023131603A1 (de) | Method for optimizing environment perception for a driving support system by means of an additional reference sensor system |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN118871955A (de) |
DE (1) | DE102022200139A1 (de) |
WO (1) | WO2023131603A1 (de) |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102018123735A1 (de) | 2018-09-26 | 2020-03-26 | HELLA GmbH & Co. KGaA | Method and device for improving object detection of a radar device |
2022
- 2022-01-10 DE DE102022200139.7A patent/DE102022200139A1/de active Pending
2023
- 2023-01-03 CN CN202380026290.4A patent/CN118871955A/zh active Pending
- 2023-01-03 WO PCT/EP2023/050053 patent/WO2023131603A1/de active Application Filing
Non-Patent Citations (2)
Title |
---|
SCHAERMANN, Alexander, et al.: "Validation of vehicle environment sensor models", 2017 IEEE Intelligent Vehicles Symposium (IV), 11 June 2017, pp. 405-411, XP033133785, DOI: 10.1109/IVS.2017.7995752 * |
ZHOU, Wei, et al.: "Automated Evaluation of Semantic Segmentation Robustness for Autonomous Driving", arXiv.org, Cornell University Library, 24 October 2018, XP080927273 * |
Also Published As
Publication number | Publication date |
---|---|
DE102022200139A1 (de) | 2023-07-13 |
CN118871955A (zh) | 2024-10-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AT521607B1 (de) | Method and device for testing a driver assistance system | |
DE102008013366B4 (de) | Method for providing information for driver assistance systems | |
DE102016212326A1 (de) | Method for processing sensor data for a position and/or orientation of a vehicle | |
DE102018126270A1 (de) | Decentralized minimal-risk-condition vehicle control | |
DE102018215753A1 (de) | Device and method for determining a trajectory of a vehicle | |
DE102020206755A1 (de) | Redundancy information for an object interface for highly and fully automated driving | |
DE102019115330A1 (de) | Real-time safety path generation for highly automated vehicle fallback maneuvers | |
DE102017201796A1 (de) | Control device for determining an ego-motion of a motor vehicle, motor vehicle, and method for providing the control device | |
DE102020214596A1 (de) | Method for generating training data for a recognition model for recognizing objects in sensor data of a vehicle environment sensor system, method for generating such a recognition model, and method for controlling a vehicle actuator system | |
DE102016202317A1 (de) | Method for controlling vehicle functions by a driver assistance system, driver assistance system, and vehicle | |
WO2023131603A1 (de) | Method for optimizing environment perception for a driving support system by means of an additional reference sensor system | |
DE102019211006B4 (de) | Evaluating sensor data of a vehicle | |
DE102017223264A1 (de) | Method and device for controlling an actuator | |
DE102021111724A1 (de) | Method and computer program for evaluating a software version of a driver assistance system | |
DE102021101717A1 (de) | Method for providing fused data, assistance system, and motor vehicle | |
DE102016003116A1 (de) | Method for operating a chassis of a motor vehicle | |
DE102019127322A1 (de) | Method for detecting objects in a vehicle environment, data processing device, computer program product, and computer-readable data carrier | |
DE102019218476A1 (de) | Device and method for measuring, simulating, labeling, and evaluating components and systems of vehicles | |
DE102019220607A1 (de) | Use of ultrasound-based subsystems for 360° environment detection | |
WO2022199916A1 (de) | Method for evaluating software for a control unit of a vehicle | |
DE102021209670A1 (de) | Concept for monitoring a data fusion function of an infrastructure system | |
DE202021004237U1 (de) | Computer-readable storage medium and computing device for evaluating a software version of a driver assistance system | |
DE102022206603A1 (de) | Method for hand detection, computer program, and device | |
WO2024008623A1 (de) | Computer-implemented method and control unit for determining a required safety integrity level of safety-related vehicle functions | |
WO2024013162A1 (de) | Method for determining deficiencies in the detection of objects in data captured by an environment sensor of a motor vehicle | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23700058 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18727362 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202380026290.4 Country of ref document: CN |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 23700058 Country of ref document: EP Kind code of ref document: A1 |