WO2020020654A1 - Method for operating a driver assistance system having two detection devices - Google Patents
- Publication number
- WO2020020654A1, PCT/EP2019/068710 (EP2019068710W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- detection
- detection device
- motor vehicle
- driver assistance
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/86—Combinations of sonar systems with lidar systems; Combinations of sonar systems with systems not using wave reflection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/93—Sonar systems specially adapted for specific applications for anti-collision purposes
- G01S15/931—Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9318—Controlling the steering
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9323—Alternative operation using light waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9324—Alternative operation using ultrasonic waves
Definitions
- The invention relates to a method for operating a driver assistance system of a motor vehicle with a first detection device, a second detection device and a central electronic computing device.
- By means of the first detection device, a first piece of information about the surroundings is acquired in a first detection mode; by means of the second detection device, a second piece of information about the surroundings is acquired in a second detection mode different from the first; and the first information and the second information are evaluated by means of the electronic computing device.
- DE 10 2009 009 047 A1 discloses a method for object detection.
- In this object detection, two images of an environment are recorded and a disparity image is determined by means of stereo image processing; a depth map of the environment is determined from the ascertained disparities, in which a free-space boundary line is identified that delimits an unobstructed area of the environment.
- The depth map is segmented outside and along the free-space boundary line by forming segments of a suitable width from pixels of equal or similar distance to an image plane; a height is estimated for each segment as part of an object outside the unobstructed area, so that each segment is characterized by the two-dimensional position of its base point and its height.
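The prior-art steps above can be sketched roughly as follows; the focal length, baseline and depth tolerance are purely illustrative assumptions, not values from the cited document:

```python
import numpy as np

# Hypothetical stereo parameters (not from the cited document): focal
# length in pixels and stereo baseline in metres.
FOCAL_PX = 700.0
BASELINE_M = 0.12

def depth_from_disparity(disparity_px: np.ndarray) -> np.ndarray:
    """Convert a disparity image to a depth map via Z = f * B / d."""
    depth = np.full(disparity_px.shape, np.inf)
    valid = disparity_px > 0
    depth[valid] = FOCAL_PX * BASELINE_M / disparity_px[valid]
    return depth

def column_segments(depth_col: np.ndarray, tol_m: float = 0.5):
    """Group consecutive pixels of one image column whose depth differs
    from the segment's first pixel by less than tol_m, mimicking the
    segment formation described above.  Returns (top, bottom, depth)."""
    segments, start = [], 0
    for i in range(1, len(depth_col)):
        if abs(depth_col[i] - depth_col[start]) > tol_m:
            segments.append((start, i - 1, float(np.median(depth_col[start:i]))))
            start = i
    segments.append((start, len(depth_col) - 1, float(np.median(depth_col[start:]))))
    return segments
```

Each returned segment is thus characterized by the position of its base and its height, in the spirit of the segmentation described above.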
- The object of the present invention is to improve the operation of a driver assistance system of a motor vehicle.
- One aspect of the invention relates to a method for operating a driver assistance system of a motor vehicle with a first detection device, a second detection device and a central electronic computing device. By means of the first detection device, a first piece of information about the surroundings of the motor vehicle is acquired in a first detection mode; by means of the second detection device, a second piece of information about the surroundings is recorded in a second detection mode different from the first; and the first information and the second information are evaluated by means of the electronic computing device.
- The first information and the second information are merged, the merged information is combined into a data packet as a function of a decision criterion, and the data packet is transmitted to the electronic computing device for evaluation.
- The first information and the second information are each raw data of the first detection device and the second detection device. These raw data are then merged accordingly; the associated data are then combined, which is also referred to as clustering, as a function of the decision criterion.
- The amount of data is greatly reduced by merging the first information and the second information.
- Furthermore, a plurality of detection devices can be combined for the merging, on the one hand to maintain the diversity of the detection devices of the driver assistance system or a highly automated system, which increases the stability of the merging, and on the other hand to further reduce the amount of data.
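A minimal sketch of what merging raw data from two detection devices into a common reference frame could look like; the mounting offsets and the identity rotation are assumptions for illustration only:

```python
import numpy as np

# Assumed mounting offsets of the two detection devices relative to the
# vehicle origin (purely illustrative values, not from the patent).
CAMERA_OFFSET = np.array([1.8, 0.0, 1.2])
LIDAR_OFFSET = np.array([2.0, 0.0, 1.6])

def to_vehicle_frame(points: np.ndarray, sensor_offset: np.ndarray) -> np.ndarray:
    """Translate raw sensor points (N x 3) into the common vehicle frame.
    A real system would also apply the sensor's rotation; identity is
    assumed here for brevity."""
    return points + sensor_offset

def fuse_raw(camera_pts: np.ndarray, lidar_pts: np.ndarray) -> np.ndarray:
    """Concatenate both raw point sets in the common frame - the merging
    step that precedes any clustering or object-level processing."""
    return np.vstack([
        to_vehicle_frame(camera_pts, CAMERA_OFFSET),
        to_vehicle_frame(lidar_pts, LIDAR_OFFSET),
    ])
```

Only after this step are the associated data clustered as a function of the decision criterion.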
- The type of detection is to be understood in particular as the way in which the detection device detects the surroundings.
- For example, a lidar sensor can use laser beams to record the surroundings.
- A camera can capture the surroundings by taking pictures. For example, an ultrasonic sensor can capture the environment by sending and receiving ultrasonic signals.
- For example, a radar sensor can detect the surroundings by transmitting and receiving radar signals.
- The merged information is combined with further information from a further sensor of the motor vehicle to form the data packet. The further information can be surroundings data from a plurality of detection devices, in order to be able to generate information about, for example, a target object in the surroundings of the motor vehicle. It is also possible that, for example, a point cloud with additional information is generated.
- This additional information, such as a potential object type and/or a detection probability and/or a surface condition, provides valuable added value without restricting the level of detail of the data packet as much as would be the case with an object interface according to the prior art. As a result, the driver assistance system can be operated in a data-reduced manner while still being highly functional.
- The characterizing property can be, for example, a property relating to the nature of a road in the surroundings of the motor vehicle or, for example, an estimated classification of an object.
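One plausible shape for such an enriched data packet, sketched with illustrative field names (the patent does not define a concrete format):

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class FusedCluster:
    """A clustered group of fused raw measurements."""
    points: List[Tuple[float, float, float]]  # raw 3-D points in the cluster

@dataclass
class DataPacket:
    """Data packet sent to the central computing device: clustered raw
    data plus optional enrichment, preserving more detail than a fully
    abstracted object list would."""
    clusters: List[FusedCluster]
    object_type: Optional[str] = None         # potential object type
    detection_probability: Optional[float] = None
    surface_condition: Optional[str] = None   # e.g. road surface quality

packet = DataPacket(
    clusters=[FusedCluster(points=[(5.0, 0.2, 0.0), (5.1, 0.2, 0.4)])],
    object_type="pedestrian",
    detection_probability=0.87,
)
```

The enrichment fields stay optional, so the raw-data detail of the clusters is never traded away for the additional attributes.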
- The surroundings are detected by means of a camera as the first detection device and by means of a lidar sensor as the second detection device. This makes it possible to capture the environment on the basis of two different types of data acquisition, so that the surroundings can be reliably detected using different detection modes. It is also possible that, for example, a radar sensor or an ultrasonic sensor is provided as the first and/or second detection device, by means of which the surroundings of the motor vehicle can be detected.
- The information obtained in each case by the camera and/or the lidar sensor is then combined and forwarded to the electronic computing device as a data packet.
- The first information and the second information are merged on the basis of a distance of the detected environment from the motor vehicle as a decision criterion, in particular on the basis of a vertical extension.
- The data are combined accordingly. For example, object detection can thereby be carried out reliably, since the information relating to the object is fused.
- The environment for the driver assistance system can be reliably detected in particular by the embodiment with the vertical extension.
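A toy illustration of such a decision criterion, with assumed thresholds: fused 3-D points are grouped when their range from the vehicle is similar, and each resulting cluster can then be summarized by its vertical extension:

```python
import math

def cluster_by_range(points, range_tol=0.5):
    """Greedy clustering of (x, y, z) points: a point joins the current
    cluster when its horizontal distance from the vehicle differs from
    the cluster's first point by less than range_tol metres."""
    clusters = []
    for p in sorted(points, key=lambda p: math.hypot(p[0], p[1])):
        r = math.hypot(p[0], p[1])
        if clusters and abs(r - clusters[-1]["range"]) < range_tol:
            clusters[-1]["points"].append(p)
        else:
            clusters.append({"range": r, "points": [p]})
    return clusters

def vertical_extension(cluster):
    """Vertical extent of a cluster - the second part of the criterion."""
    zs = [p[2] for p in cluster["points"]]
    return max(zs) - min(zs)
```

Points stacked at a similar range thus collapse into one cluster whose vertical extent hints at an upright object, while the transmitted data volume shrinks from many points to a few clusters.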
- By means of the driver assistance system, an at least partially autonomous, in particular fully autonomous, driving operation of the motor vehicle is carried out.
- The driver assistance system is in particular a highly automated system.
- The driver assistance system can also be referred to as a driver support system or vehicle guidance system.
- The surroundings can be recorded in real time, and thus the information regarding the environment can be reliably evaluated in the electronic computing device. In this way, particularly fast response times of the driver assistance system can be realized, which in particular increases safety in road traffic.
- The single figure shows a schematic view of a motor vehicle with an embodiment of a driver assistance system.
- The figure shows a motor vehicle 10 with an embodiment of a driver assistance system 12. The driver assistance system 12 has a first detection device 14 and a second detection device 16. A first piece of information 22 about an environment 18 of the motor vehicle 10 can be recorded by means of the first detection device 14, and a second piece of information 24 about the environment 18 is recorded with the second detection device 16. The first detection device 14 detects the environment 18 in a first detection mode, and the second detection device 16 detects the environment 18 in a second detection mode.
- The driver assistance system 12 also has a central electronic computing device 20, by means of which the first information 22 and the second information 24 are evaluated.
- The first information 22 and the second information 24 are merged, the merged information 22, 24 is combined into a data packet 26 as a function of a decision criterion, and the data packet 26 is transmitted to the electronic computing device 20 for evaluation.
- The first information 22 and the second information 24 are fused by means of a fusion device 28.
- The first information 22 and the second information 24 are each raw data from the first detection device 14 and the second detection device 16. These raw data are then merged accordingly; the associated data are combined, which is also referred to as clustering, as a function of the decision criterion. The amount of data is greatly reduced by merging the first information 22 and the second information 24. Furthermore, several detection devices 14, 16 can be combined for the merging, on the one hand to maintain the diversity of the detection devices 14, 16 of the driver assistance system 12 or a highly automated system, which improves the stability of the merging, and on the other hand to further reduce the amount of data.
- The merging also makes the system more robust to sensor-specific misbehavior of individual sensors of the motor vehicle 10. This enables the driver assistance system 12 to achieve a higher overall performance and increased robustness.
- The type of detection is to be understood in particular as the way in which the detection device 14, 16 detects the surroundings 18.
- For example, a lidar sensor can detect the surroundings 18 using laser beams as the type of detection.
- A camera can capture the surroundings 18 by taking pictures.
- For example, an ultrasonic sensor can capture the surroundings 18 by sending and receiving ultrasonic signals. For example, a radar sensor can detect the surroundings 18 by transmitting and receiving radar signals.
- The merged information 22, 24 is combined with further information 30 from a further sensor 32 of the motor vehicle 10 to form the data packet 26, and the data packet 26 is transmitted to the electronic computing device 20 for evaluation.
- The further information 30 can be surroundings data from a plurality of detection devices 14, 16, in particular sensors, in order to be able to generate information about, for example, a target object in the surroundings 18 of the motor vehicle 10. It is also possible that, for example, a point cloud with additional information can be generated.
- This further information 30, such as a potential object type and/or a detection probability and/or a surface quality, provides valuable added value without restricting the level of detail of the data packet 26 as much as would be the case with an object interface according to the prior art. In this way, the driver assistance system 12 can be operated in a data-reduced and yet highly functional manner.
- A characterizing property of the environment 18 is specified by means of the further information 30.
- The characterizing property can be, for example, a property relating to the nature of a road in the surroundings 18 of the motor vehicle 10 or, for example, an estimated classification of an object.
- This further information 30 can then in particular be appended to the information 22, 24 that has already been merged and passed on to the central electronic computing device 20. In this way, a potential object type, a detection probability, a surface condition or the like can also be attached to the data packet 26, as a result of which additional information can be made available without restricting the level of detail of the data as much as would be the case with object interfaces according to the prior art.
- The surroundings 18 are recorded by means of a camera as the first detection device 14 and by means of a lidar sensor as the second detection device 16. It is also possible for the first detection device 14 and/or the second detection device 16 to be designed as a radar sensor or as an ultrasonic sensor.
- The first information 22 and the second information 24 are merged on the basis of a distance of the detected environment 18 from the motor vehicle 10 as a decision criterion, in particular on the basis of a vertical extension.
- The data are combined accordingly.
- For example, object detection can thereby be carried out reliably, since the information 22, 24 relating to the object is merged.
- The environment 18 for the driver assistance system 12 can be reliably detected in particular by the configuration with the vertical extension.
- The driver assistance system 12 enables at least partially autonomous, in particular fully autonomous, driving operation of the motor vehicle 10. The driver assistance system 12 is in particular a highly automated system.
- The driver assistance system 12 can also be referred to as a driver support system or vehicle guidance system.
- The surroundings 18 can be detected in particular in real time, and thus the information 22, 24, 30 relating to the surroundings 18 can be reliably evaluated in the electronic computing device 20. This allows particularly fast response times of the driver assistance system 12 to be realized, which in particular increases safety in road traffic.
- The invention thus provides a method for multimodal vertical clustering of raw data with enriched information from several sensor types.
Abstract
The invention relates to a method for operating a driver assistance system (12) of a motor vehicle (10) having a first detection device (14), a second detection device (16) and a central electronic computing device (20). By means of the first detection device (14), a first piece of information (22) about an environment (18) of the motor vehicle (10) is detected in a first detection mode and, by means of the second detection device (16), a second piece of information (24) about the environment (18) is detected in a second detection mode different from the first detection mode, and the first information (22) and the second information (24) are evaluated by means of the electronic computing device (20). The first information (22) and the second information (24) are merged, the merged information (22, 24) is combined into a data packet (26) as a function of a decision criterion, and the data packet (26) is transmitted to the electronic computing device (20) for evaluation.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102018005969.4A DE102018005969A1 (de) | 2018-07-27 | 2018-07-27 | Verfahren zum Betreiben eines Fahrerassistenzsystems mit zwei Erfassungseinrichtungen |
DE102018005969.4 | 2018-07-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020020654A1 (fr) | 2020-01-30 |
Family
ID=67262324
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2019/068710 WO2020020654A1 (fr) | 2019-07-11 | Procédé pour faire fonctionner un système d'aide à la conduite doté de deux dispositifs de détection |
Country Status (2)
Country | Link |
---|---|
DE (1) | DE102018005969A1 (fr) |
WO (1) | WO2020020654A1 (fr) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022108745A1 (fr) * | 2020-11-23 | 2022-05-27 | Argo AI, LLC | Systèmes et procédés de détection d'objet avec une décorrélation lidar |
US11430224B2 (en) | 2020-10-23 | 2022-08-30 | Argo AI, LLC | Systems and methods for camera-LiDAR fused object detection with segment filtering |
US11885886B2 (en) | 2020-10-23 | 2024-01-30 | Ford Global Technologies, Llc | Systems and methods for camera-LiDAR fused object detection with LiDAR-to-image detection matching |
US12050273B2 (en) | 2020-10-23 | 2024-07-30 | Ford Global Technologies, Llc | Systems and methods for camera-LiDAR fused object detection with point pruning |
US12122428B2 (en) | 2020-10-23 | 2024-10-22 | Ford Global Technologies, Llc | Systems and methods for camera-LiDAR fused object detection with segment merging |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102009009047A1 (de) | 2009-02-16 | 2010-08-19 | Daimler Ag | Verfahren zur Objektdetektion |
US8886394B2 (en) * | 2009-12-17 | 2014-11-11 | Bae Systems Plc | Producing data describing states of a plurality of targets |
-
2018
- 2018-07-27 DE DE102018005969.4A patent/DE102018005969A1/de not_active Withdrawn
-
2019
- 2019-07-11 WO PCT/EP2019/068710 patent/WO2020020654A1/fr active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102009009047A1 (de) | 2009-02-16 | 2010-08-19 | Daimler Ag | Verfahren zur Objektdetektion |
US8886394B2 (en) * | 2009-12-17 | 2014-11-11 | Bae Systems Plc | Producing data describing states of a plurality of targets |
Non-Patent Citations (2)
Title |
---|
GROSSMANN P: "MULTISENSOR DATA FUSION", GEC JOURNAL OF RESEARCH, GEC MARCONI RESEARCH CENTER. GREAT BADDOW CHEMSFORD, GB, vol. 15, no. 1, 1 January 1998 (1998-01-01), pages 27 - 37, XP000832810, ISSN: 0264-9187 * |
ULLRICH SCHEUNERT ET AL: "Early and Multi Level Fusion for Reliable Automotive Safety Systems", INTELLIGENT VEHICLES SYMPOSIUM, 2007 IEEE, IEEE, PI, 1 June 2007 (2007-06-01), pages 196 - 201, XP031126944, ISBN: 978-1-4244-1067-5 * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11430224B2 (en) | 2020-10-23 | 2022-08-30 | Argo AI, LLC | Systems and methods for camera-LiDAR fused object detection with segment filtering |
US11885886B2 (en) | 2020-10-23 | 2024-01-30 | Ford Global Technologies, Llc | Systems and methods for camera-LiDAR fused object detection with LiDAR-to-image detection matching |
US12050273B2 (en) | 2020-10-23 | 2024-07-30 | Ford Global Technologies, Llc | Systems and methods for camera-LiDAR fused object detection with point pruning |
US12122428B2 (en) | 2020-10-23 | 2024-10-22 | Ford Global Technologies, Llc | Systems and methods for camera-LiDAR fused object detection with segment merging |
WO2022108745A1 (fr) * | 2020-11-23 | 2022-05-27 | Argo AI, LLC | Systèmes et procédés de détection d'objet avec une décorrélation lidar |
US12118732B2 (en) | 2020-11-23 | 2024-10-15 | Ford Global Technologies, Llc | Systems and methods for object detection with LiDAR decorrelation |
Also Published As
Publication number | Publication date |
---|---|
DE102018005969A1 (de) | 2020-01-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE102017203838B4 (de) | Verfahren und System zur Umfelderfassung | |
WO2020020654A1 (fr) | Procédé pour faire fonctionner un système d'aide à la conduite doté de deux dispositifs de détection | |
EP2394234B1 (fr) | Procédé et dispositif de détermination d'un marquage de voie de circulation en vigueur | |
DE102018218220A1 (de) | Steuergerät für ein Fahrzeug | |
EP2150939B1 (fr) | Procédé et dispositif pour déterminer la position d'un panneau de signalisation | |
DE102015220252A1 (de) | Spurhalte-Unterstützungssystem und Verfahren für dieses | |
DE102016219455A1 (de) | Kraftfahrzeug, Verfahren und aktive Infrastruktur zur Überprüfung einer Umfelderfassung eines Kraftfahrzeugs | |
WO2015032709A1 (fr) | Procédé et dispositif de surveillance de la circulation routière pour la constatation d'une circulation dans le mauvais sens d'un véhicule automobile | |
EP2040220B1 (fr) | Procédé destiné à la détection d'obstacles | |
EP2033165B1 (fr) | Procede pour la detection d'un espace de circulation | |
EP3044727B1 (fr) | Procédé et dispositif de détection d'objets d'après des données d'image ayant une résolution de profondeur | |
DE102018121008A1 (de) | Kreuzverkehrserfassung unter verwendung von kameras | |
DE102016216070A1 (de) | Steuergerät, System mit solch einem Steuergerät und Verfahren zum Betrieb solch eines Systems | |
DE102016012345A1 (de) | Verfahren zum Erkennen von Objekten | |
DE102008025773A1 (de) | Verfahren zur Schätzung eines Orts- und Bewegungszustands eines beobachteten Objekts | |
DE102013021840A1 (de) | Verfahren zum Erzeugen eines Umgebungsmodells eines Kraftfahrzeugs, Fahrerassistenzsystem und Kraftfahrzeug | |
DE202016001002U1 (de) | System zur Erkennung einer Richtungsfahrbahn, Kraftfahrzeug sowie Computerprogrammprodukt | |
EP3655299B1 (fr) | Procédé et dispositif de détermination d'un flux optique à l'aide d'une séquence d'images enregistrée par une caméra d'un véhicule | |
DE102017211887A1 (de) | Verfahren und Vorrichtung zum Lokalisieren und automatisierten Betreiben eines Fahrzeugs | |
DE102019102672A1 (de) | Intersensorisches lernen | |
DE102019218078B4 (de) | Bestimmung eines Sachverhalts im Umfeld eines Kraftfahrzeugs | |
DE102017100669A1 (de) | Verfahren zum Erfassen eines Umgebungsbereichs eines Kraftfahrzeugs mit Anpassung einer Region von Interesse in Abhängigkeit von einem Anhänger, Recheneinrichtung, Kamerasystem sowie Kraftfahrzeug | |
EP4053593A1 (fr) | Traitement des données de capteur dans un moyen de transport | |
WO2021180679A1 (fr) | Détermination d'une zone de focalisation actuelle d'une image de caméra sur la base de la position de la caméra de véhicule sur le véhicule et sur la base d'un paramètre de mouvement actuel | |
DE102014214713A1 (de) | Verfahren zum Betrieb eines Assistenzsystems eines Kraftfahrzeugs sowie Assistenzsystem |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19739592 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19739592 Country of ref document: EP Kind code of ref document: A1 |