EP4352709A1 - Procédé de détection d'objets pertinents pour la sécurité d'un véhicule - Google Patents

Procédé de détection d'objets pertinents pour la sécurité d'un véhicule

Info

Publication number
EP4352709A1
Authority
EP
European Patent Office
Prior art keywords
vehicle
relevant
classified
safety
detected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP22724022.3A
Other languages
German (de)
English (en)
Inventor
Stefan Studer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mercedes Benz Group AG
Original Assignee
Mercedes Benz Group AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mercedes Benz Group AG filed Critical Mercedes Benz Group AG
Publication of EP4352709A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/18 Eye characteristics, e.g. of the iris

Definitions

  • The invention relates to a method for detecting objects in the surroundings of a vehicle that are relevant to the vehicle's safety, the surroundings and the objects located in them being detected using signals from an environment sensor system of the vehicle.
  • DE 10 2019 004 075 A1 discloses a method for determining the relevance of an object in the surroundings of a motor vehicle for that motor vehicle by means of a driver assistance system of the motor vehicle, as well as such a driver assistance system.
  • In that method, the object detected in the surroundings of the motor vehicle by a detection device of the driver assistance system is compared, using an electronic computing device of the driver assistance system, with an object stored in a memory device of the driver assistance system, and the object is identified as a function of the comparison. A gaze detection device of the driver assistance system further detects a line of sight of the driver of the motor vehicle, and the relevance of the identified object is determined as a function of the detected line of sight.
  • The invention is based on the object of specifying a method for detecting objects that are safety-relevant for a vehicle.
  • A method for detecting objects that are safety-relevant for a vehicle, in particular traffic routes, landmarks and places of interest, in the surroundings of the vehicle provides that the surroundings and the objects located in them are detected using signals from an environment sensor system of the vehicle.
  • A respective object which was recognized by means of an object recognition device of the vehicle and initially classified as not relevant for the vehicle is reclassified as potentially relevant if a vehicle-side gaze detection device recognizes that the gaze of at least one vehicle occupant, in particular of all vehicle occupants, rests on the respective object for longer than a predetermined minimum viewing time (see the gaze-dwell sketch at the end of this section).
  • By means of the detected gaze direction it is determined that a respective moving object classified as potentially relevant is tracked by the at least one vehicle occupant, in particular by all vehicle occupants, for at least a specified period of time. The trajectory of the respective detected moving object classified as potentially relevant for the vehicle is identified as safety-relevant for the vehicle after the specified period of time has elapsed if it is necessary for the vehicle to react to the respective object, for example by a steering movement and/or braking or acceleration by the driver or by a driver assistance system (see the trajectory-reaction sketch at the end of this section).
  • Safety can be increased by paying attention to objects on safety-relevant trajectories, in particular with regard to a function of a driver assistance system.
  • The relevance of the respective object relates to a driving task of the vehicle.
  • Objects can be identified as safety-relevant for the vehicle in a simple manner on the basis of identified safety-relevant trajectories.
  • Objects are recognized as safety-relevant when they are located on trajectories that were identified as safety-relevant at an earlier point in time.
  • The respective trajectory of a moving object identified as safety-relevant is stored as safety-relevant traffic routing in a digital map of the vehicle and/or in a computer unit that is data-linked to the vehicle.
  • Safety-relevant traffic routing is also aggregated in the digital map over a comparatively large number of journeys: whenever an object is classified as safety-relevant, its trajectory is entered in the digital map together with location information. The relevance of the traffic routing is therefore confirmed several times before it is stored in the digital map as safety-relevant (see the confirmation-aggregation sketch at the end of this section).
  • The digital map stored in the vehicle and/or in the computer unit is made available to other vehicles. The stored trajectories are made available to the driver assistance system, so that objects on these trajectories can in future be regarded as relevant for a driving task of the vehicle and of the other vehicles.
  • An object detected at a later point in time is classified as relevant depending on its presence on one of the stored safety-relevant traffic routings. If the later-detected object is on a trajectory that was previously determined to be safety-relevant, this object represents a safety-relevant moving object for the vehicle (see the route-matching sketch at the end of this section).
  • One possible embodiment of the method provides that, in order to identify landmarks and/or places of interest, a respective stationary object is classified as potentially relevant by means of a model. The stationary object classified as potentially relevant is displayed to at least one vehicle occupant, who classifies it as a landmark and/or as a place of interest (see the landmark/place-of-interest sketch at the end of this section). A distinction is thus made between landmarks, which are used for navigation, and places of interest, by means of which the driving experience can be enhanced.
  • The stationary object classified as potentially relevant is stored in the digital map as a landmark and/or as a place of interest.
  • The respective landmark and/or place of interest is displayed in the vehicle and can be used for navigation, for example during a rally drive, and to enhance the driving experience, for example by driving to the place of interest.
  • Fig. 1 shows a diagram of a roadway section with two lanes running in opposite directions and a vehicle traveling in the right-hand lane.
  • The single figure shows a roadway section F with two opposing lanes F1, F2.
  • A vehicle 1 is driving in the right-hand lane F1; a first vehicle 2 is driving ahead of it, and a second vehicle 3 is approaching it in the left-hand lane F2.
  • A driveway F3 leads into the right-hand lane F1, onto which a third vehicle 4, as a first object O1, intends to merge.
  • The vehicle 1 has an environment sensor system 1.1, which comprises a number of detection units, not shown in detail, arranged in and/or on the vehicle 1.
  • The detection units are designed as lidar-based, radar-based and/or ultrasound-based sensors and/or as a multifunction camera.
  • The environment sensor system 1.1 has a field of view S1; a further field of view S2 of the vehicle occupants of vehicle 1 almost corresponds to the field of view S1 of the environment sensor system 1.1 or is a subset thereof. In particular, there are two vehicle occupants in vehicle 1.
  • The surroundings of the vehicle 1 and the objects O1 to O4 located therein are detected on the basis of signals from the environment sensor system 1.1, with a large number of objects O1 to O4 being recognized by means of an object recognition device 1.2 of the vehicle 1.
  • The problem can arise that, without context information on the individual recognized objects O1 to O4, their relevance for the vehicle 1 is not, or cannot be, adequately assessed.
  • The relevance relates both to the driving task and to objects O3, O4 used for navigation and/or as places of interest.
  • All recognized objects O1 to O4 that are classified by a driver assistance system of vehicle 1 (not shown in detail) as not relevant to the driving task are selected. These objects O1 to O4 initially come into consideration as potentially relevant.
  • The driver assistance system comprises a satellite-based position determination unit, navigation data, the environment sensor system 1.1, the object recognition device 1.2 and an interior camera 1.3 with a gaze detection device 1.4.
  • The driver assistance system usually selects certain objects O1 to O4 and discards others, for example in order to reduce computing time. In one possible embodiment, it can be a driver assistance system for automated, for example highly automated, driving operation of the vehicle 1.
  • These objects O1 to O4 are classified as potentially relevant if the vehicle-mounted gaze detection device 1.4 detects, at a point in time t1 within a predetermined period of time, that the vehicle occupants of vehicle 1 look at the respective object O1 to O4 for longer than the predetermined minimum gaze duration.
  • The prerequisite for this is that the field of view S1 of the environment sensor system 1.1 essentially corresponds to the further field of view S2 of the vehicle occupants.
  • The tractor 5, as a second object O2, can drive from the dirt road F4 onto the roadway section F and can become safety-relevant for the vehicle 1 at a further point in time t2 at the end of the specified period of time.
  • The third vehicle 4, as a moving first object O1 on the driveway F3, can likewise become relevant for the vehicle 1 at the further point in time t2.
  • The trajectory T of the respective detected moving object O1, O2 classified as potentially relevant for the vehicle 1 is identified as safety-relevant for the vehicle 1 after the specified period of time has elapsed if, on the basis of the trajectory T of the third vehicle 4 and/or of the tractor 5, it is necessary for the vehicle 1 to react to the respective object O1, O2, for example by a steering movement and/or braking or acceleration by the driver or by a driver assistance system.
  • The trajectories T of the moving objects O1, O2 identified as safety-relevant are stored as safety-relevant traffic routing in a digital map after the relevance for the driving task has been confirmed several times, for example by other vehicles of a vehicle fleet (not shown in detail) to which the vehicle 1 belongs.
  • This digital map is stored in the vehicle 1 and/or in a computer unit (not shown in detail) which is data-linked to the vehicle 1.
  • The digital map with the stored safety-relevant traffic routings can be made available to other vehicles by the vehicle 1 and/or by the computer unit.
  • The map is provided in particular when the respective other vehicle is on the corresponding roadway section F.
  • The safety-relevant traffic routing is also made available to the driver assistance system.
  • The method further provides that, by means of a stored model, in particular a machine learning model, the respective object O3, O4 detected by the object recognition device 1.2 is classified as a stationary object O3, O4, here in the form of a palm tree 6 and a lookout tower 7.
  • The respective object O3, O4 is then displayed to the vehicle occupants of the vehicle 1, in particular on a display unit, for example in the area of an instrument panel.
  • The vehicle occupant or occupants then have the option of classifying the respective object O3, O4 as a landmark and/or as a place of interest; for example, a corresponding query is shown on the display unit.
  • A landmark is relevant in particular for navigating the vehicle 1, e.g. for a rally drive in which navigation is carried out by means of landmarks.
  • The landmarks and places of interest are stored in the digital map and are therefore available to the vehicle 1 and, where applicable, to other vehicles, at least those of the same vehicle fleet.
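
Gaze-dwell sketch: a minimal Python illustration of the first step described above, in which an object initially classified as not relevant is reclassified as potentially relevant once the gaze of the occupant(s) rests on it for longer than the predetermined minimum viewing time. All names and the threshold value are hypothetical; the patent does not prescribe any particular data structures, sampling scheme or numeric values.

    from dataclasses import dataclass

    MIN_VIEWING_TIME_S = 1.5  # assumed value for the "predetermined minimum viewing time"

    @dataclass
    class GazeSample:
        t: float        # timestamp in seconds
        object_id: int  # id of the object the occupant's gaze is resolved to, or -1 if none

    def dwell_time_per_object(samples):
        """Accumulate gaze dwell time per object id from time-ordered gaze samples."""
        dwell = {}
        for prev, cur in zip(samples, samples[1:]):
            if prev.object_id >= 0 and prev.object_id == cur.object_id:
                dwell[cur.object_id] = dwell.get(cur.object_id, 0.0) + (cur.t - prev.t)
        return dwell

    def reclassify_potentially_relevant(not_relevant_ids, gaze_samples_per_occupant):
        """Mark an initially not-relevant object as potentially relevant if every occupant
        (the 'in particular all vehicle occupants' variant) looked at it longer than the minimum time."""
        potentially_relevant = set()
        for obj_id in not_relevant_ids:
            if all(dwell_time_per_object(samples).get(obj_id, 0.0) > MIN_VIEWING_TIME_S
                   for samples in gaze_samples_per_occupant):
                potentially_relevant.add(obj_id)
        return potentially_relevant

    # Usage example: two occupants both dwell on object 1, nobody dwells on object 2.
    occ1 = [GazeSample(0.0, 1), GazeSample(1.0, 1), GazeSample(2.0, 1)]
    occ2 = [GazeSample(0.0, 1), GazeSample(0.9, 1), GazeSample(2.1, 1)]
    print(reclassify_potentially_relevant({1, 2}, [occ1, occ2]))  # -> {1}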
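
Trajectory-reaction sketch: a minimal Python illustration of how the trajectory of a visually tracked, potentially relevant object could be promoted to safety-relevant once the specified tracking period has elapsed. The reaction test used here, a simple time-to-closest-approach check on straight-line trajectories, is only an assumption for illustration; the patent merely requires that a steering, braking or acceleration reaction by the driver or a driver assistance system becomes necessary.

    import math

    SPECIFIED_TRACKING_PERIOD_S = 3.0  # assumed value for the "specified period of time"
    REACTION_TIME_THRESHOLD_S = 4.0    # assumed: closest approach sooner than this forces a reaction

    def time_to_closest_approach(ego_pos, ego_vel, obj_pos, obj_vel):
        """Time at which ego vehicle and object are closest, assuming constant velocities."""
        rx, ry = obj_pos[0] - ego_pos[0], obj_pos[1] - ego_pos[1]
        vx, vy = obj_vel[0] - ego_vel[0], obj_vel[1] - ego_vel[1]
        v2 = vx * vx + vy * vy
        if v2 == 0.0:
            return math.inf
        return max(0.0, -(rx * vx + ry * vy) / v2)

    def is_trajectory_safety_relevant(tracked_s, ego_pos, ego_vel, obj_pos, obj_vel, min_gap_m=3.0):
        """The trajectory T becomes safety-relevant if the object was tracked long enough
        and the predicted closest approach is soon and close enough to force a reaction."""
        if tracked_s < SPECIFIED_TRACKING_PERIOD_S:
            return False
        t = time_to_closest_approach(ego_pos, ego_vel, obj_pos, obj_vel)
        if t > REACTION_TIME_THRESHOLD_S:
            return False
        gap_x = (obj_pos[0] + obj_vel[0] * t) - (ego_pos[0] + ego_vel[0] * t)
        gap_y = (obj_pos[1] + obj_vel[1] * t) - (ego_pos[1] + ego_vel[1] * t)
        return math.hypot(gap_x, gap_y) < min_gap_m

    # Usage example: an object merging from a driveway toward the ego lane.
    print(is_trajectory_safety_relevant(3.5, (0, 0), (15, 0), (40, 6), (5, -2)))  # -> True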
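
Confirmation-aggregation sketch: a minimal Python illustration of aggregating safety-relevance reports over many journeys before a trajectory is stored in the digital map as safety-relevant traffic routing. The confirmation counter, the grid-based location key and the threshold of three confirmations are assumptions; the patent only requires that the relevance be confirmed several times, for example over several journeys or by fleet vehicles.

    from collections import defaultdict

    CONFIRMATIONS_REQUIRED = 3  # assumed number of independent confirmations
    GRID_M = 5.0                # assumed grid size used to key trajectories by location

    def trajectory_key(trajectory):
        """Key a trajectory by grid-quantised points so repeated observations map to the same entry."""
        return tuple((round(x / GRID_M), round(y / GRID_M)) for x, y in trajectory)

    class DigitalMap:
        def __init__(self):
            self._confirmations = defaultdict(int)  # pending confirmations per trajectory key
            self.safety_relevant_routes = set()     # confirmed safety-relevant traffic routing

        def report_safety_relevant(self, trajectory):
            """Called whenever a journey classifies this trajectory as safety-relevant;
            it is stored in the map only after enough independent confirmations."""
            key = trajectory_key(trajectory)
            self._confirmations[key] += 1
            if self._confirmations[key] >= CONFIRMATIONS_REQUIRED:
                self.safety_relevant_routes.add(key)

    # Usage example: the same merging trajectory reported on three different journeys.
    digital_map = DigitalMap()
    merge_trajectory = [(40.0, 6.0), (50.0, 3.0), (60.0, 0.5)]
    for _ in range(3):
        digital_map.report_safety_relevant(merge_trajectory)
    print(len(digital_map.safety_relevant_routes))  # -> 1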
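
Route-matching sketch: a minimal Python illustration of the later re-use of the digital map, where an object detected at a later point in time is classified as safety-relevant as soon as its position lies on a stored safety-relevant traffic routing. The point-to-route distance test and its tolerance are assumptions for illustration.

    import math

    ON_ROUTE_TOLERANCE_M = 5.0  # assumed tolerance for "lies on a stored route"

    def is_on_stored_route(position, stored_routes):
        """True if the detected position is close to any point of any stored safety-relevant route."""
        px, py = position
        return any(math.hypot(px - x, py - y) <= ON_ROUTE_TOLERANCE_M
                   for route in stored_routes
                   for (x, y) in route)

    def classify_detected_object(position, stored_routes):
        """An object detected later counts as safety-relevant if it appears on a traffic routing
        that was previously identified as safety-relevant; otherwise it stays not relevant."""
        return "safety-relevant" if is_on_stored_route(position, stored_routes) else "not relevant"

    # Usage example: the stored merging route from the driveway and two newly detected objects.
    stored = [[(40.0, 6.0), (50.0, 3.0), (60.0, 0.5)]]
    print(classify_detected_object((49.0, 3.5), stored))   # -> safety-relevant
    print(classify_detected_object((10.0, 40.0), stored))  # -> not relevant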
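
Landmark/place-of-interest sketch: a minimal Python illustration of the embodiment in which a model proposes a stationary object as potentially relevant, the proposal is shown to a vehicle occupant, and the occupant's answer decides whether it is stored in the digital map as a landmark and/or a place of interest. The rule-based stand-in for the "stored model, in particular a machine learning model" and the printed prompt standing in for the display unit are assumptions.

    STATIONARY_CLASSES_OF_INTEREST = {"palm_tree", "lookout_tower", "monument"}  # assumed model classes

    def model_marks_potentially_relevant(detected_class, speed_mps):
        """Stand-in for the stored model: a stationary object of an interesting class is potentially relevant."""
        return speed_mps < 0.1 and detected_class in STATIONARY_CLASSES_OF_INTEREST

    def ask_occupant(detected_class, answer):
        """Stand-in for the display-unit query; 'answer' is what the occupant selects:
        'landmark', 'place_of_interest', 'both' or 'none'."""
        print(f"Object '{detected_class}' detected - landmark, place of interest, both or none?")
        return answer

    def classify_stationary_object(detected_class, speed_mps, occupant_answer, digital_map):
        """Store the object in the digital map according to the occupant's classification."""
        if not model_marks_potentially_relevant(detected_class, speed_mps):
            return
        choice = ask_occupant(detected_class, occupant_answer)
        if choice in ("landmark", "both"):
            digital_map.setdefault("landmarks", []).append(detected_class)
        if choice in ("place_of_interest", "both"):
            digital_map.setdefault("places_of_interest", []).append(detected_class)

    # Usage example: the palm tree becomes a landmark, the lookout tower a place of interest.
    digital_map = {}
    classify_stationary_object("palm_tree", 0.0, "landmark", digital_map)
    classify_stationary_object("lookout_tower", 0.0, "place_of_interest", digital_map)
    print(digital_map)  # -> {'landmarks': ['palm_tree'], 'places_of_interest': ['lookout_tower']}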

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Ophthalmology & Optometry (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to a method for detecting objects (O1 to O4) that are relevant to the safety of a vehicle (1) in the surroundings of the vehicle (1). The surroundings and the objects located in them are detected by means of signals from an environment sensor system (1.1) of the vehicle (1). According to the invention, each object (O1 to O4) that has been detected by an object recognition device (1.2) of the vehicle (1) and initially classified as not relevant for the vehicle (1) is classified as potentially relevant if a vehicle-side gaze detection device (1.4) detects that the gaze of at least one occupant of the vehicle (1) rests on the respective object (O1 to O4) for longer than a predetermined minimum gaze duration; a detected gaze direction of the vehicle occupant(s) is used to determine that a respective moving object (O1, O2) classified as potentially relevant has been tracked by the vehicle occupant(s) for at least a specified period of time; and the trajectory (T) of the respective detected moving object (O1, O2) classified as potentially relevant for the vehicle (1) is identified as safety-relevant for the vehicle (1) after the specified period of time has elapsed if it is necessary for the vehicle (1) to react to the respective object (O1, O2).
EP22724022.3A 2021-06-07 2022-04-20 Procédé de détection d'objets pertinents pour la sécurité d'un véhicule Pending EP4352709A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102021002918.6A DE102021002918B4 (de) 2021-06-07 2021-06-07 Verfahren zur Erkennung von für ein Fahrzeug sicherheitsrelevanten Objekten
PCT/EP2022/060418 WO2022258250A1 (fr) 2021-06-07 2022-04-20 Procédé de détection d'objets pertinents pour la sécurité d'un véhicule

Publications (1)

Publication Number Publication Date
EP4352709A1 true EP4352709A1 (fr) 2024-04-17

Family

ID=81748412

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22724022.3A Pending EP4352709A1 (fr) 2021-06-07 2022-04-20 Procédé de détection d'objets pertinents pour la sécurité d'un véhicule

Country Status (5)

Country Link
EP (1) EP4352709A1 (fr)
KR (1) KR20240004883A (fr)
CN (1) CN117413302A (fr)
DE (1) DE102021002918B4 (fr)
WO (1) WO2022258250A1 (fr)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE529304C2 (sv) * 2005-09-06 2007-06-26 Gm Global Tech Operations Inc Metod och system för förbättrande av trafiksäkerhet
DE102016201939A1 (de) 2016-02-09 2017-08-10 Volkswagen Aktiengesellschaft Vorrichtung, Verfahren und Computerprogramm zur Verbesserung der Wahrnehmung bei Kollisionsvermeidungssystemen
DE102017207960A1 (de) 2017-05-11 2018-11-15 Volkswagen Aktiengesellschaft Verfahren und vorrichtung zur ortsaufgelösten detektion von einem fahrzeugexternen objekt mithilfe eines in einem fahrzeug verbauten sensors
US20180339730A1 (en) * 2017-05-26 2018-11-29 Dura Operating, Llc Method and system for generating a wide-area perception scene graph
DE102017221202B3 (de) 2017-11-27 2018-12-27 Audi Ag Fahrzeugsteuersystem für autonom fahrendes Kraftfahrzeug
DE102018219125A1 (de) 2018-11-09 2020-05-14 Volkswagen Aktiengesellschaft Verfahren zum Klassifizieren von Objekten mittels eines automatisiert fahrenden Kraftfahrzeuges und automatisiert fahrendes Kraftfahrzeug
WO2020157135A1 (fr) * 2019-01-31 2020-08-06 Robert Bosch Gmbh Système de perception environnementale pour un véhicule
DE102019004075A1 (de) 2019-06-08 2020-01-02 Daimler Ag Verfahren zum Bestimmen einer Relevanz eines Objekts in einer Umgebung eines Kraftfahrzeugs mittels eines Fahrerassistenzsystems sowie Fahrerassistenzsystem

Also Published As

Publication number Publication date
CN117413302A (zh) 2024-01-16
WO2022258250A1 (fr) 2022-12-15
KR20240004883A (ko) 2024-01-11
DE102021002918A1 (de) 2022-12-08
DE102021002918B4 (de) 2023-04-06

Similar Documents

Publication Publication Date Title
DE102015207123B4 (de) Fahrassistenzvorrichtung und -verfahren
DE102019211681B4 (de) Verfahren eines Fahrzeugs zum automatisierten Parken
DE102018114808A1 (de) Verfahren zur automatischen Querführung eines Folgefahrzeugs in einem Fahrzeug-Platoon
WO2011157251A1 (fr) Procédé pour fusionner un système de reconnaissance d'un panneau de signalisation et un système de reconnaissance de voie d'un véhicule automobile
DE102016213782A1 (de) Verfahren, Vorrichtung und computerlesbares Speichermedium mit Instruktionen zur Bestimmung der lateralen Position eines Fahrzeuges relativ zu den Fahrstreifen einer Fahrbahn
DE102009046726A1 (de) Auswahl einer Parklücke aus mehreren erkannten Parklücken
WO2018141447A1 (fr) Procédé de localisation d'un véhicule à forte automatisation, par ex. un véhicule entièrement automatisé (haf), dans une carte de localisation numérique
DE19521917C2 (de) Verfahren und Vorrichtung zur Positionsbestimmung eines Fahrzeugs
DE102019217428A1 (de) Verfahren zum Betreiben eines Fahrerassistenzsystems, Fahrerassistenzsystem und Fahrzeug
DE102014220199B3 (de) Verfahren für ein Kraftfahrzeug mit einer Kamera, Vorrichtung und System
DE102018115317A1 (de) Verfahren und Fahrunterstützungssystem zum Betreiben eines Fahrzeugs oder zum Unterstützen eines Fahrers des Fahrzeugs unter Verwendung von Fahrspurinformation
DE102020107941A1 (de) Verfahren zum Bereitstellen von unterschiedlichen Fahrerassistenzfunktionen für ein Fahrzeug zum automatisierten Abfahren von zuvor aufgezeichneten Wegstrecken, Recheneinrichtung sowie Fahrerassistenzvorrichtung
DE102019132967A1 (de) Verfahren und Vorrichtung zur Ermittlung einer Fahrspur-Hypothese
DE102018213378B4 (de) Fahrassistenzsystem für ein Fahrzeug, Fahrzeug mit demselben und Fahrassistenzverfahren für ein Fahrzeug
DE102005059415B4 (de) Spurwechselassistenzsystem für ein Fahrzeug
DE102019107224A1 (de) Fahrunterstützungsverfahren
DE102021002918B4 (de) Verfahren zur Erkennung von für ein Fahrzeug sicherheitsrelevanten Objekten
DE102016222215A1 (de) Kraftfahrzeug mit einem Fahrerassistenzsystem, das eine Kamera aufweist
DE102021103680A1 (de) Verfahren zum betreiben eines parkassistenzsystems, computerprogrammprodukt, parkassistenzsystem und fahrzeug mit einem parkassistenzsystem
DE102020101375A1 (de) Verfahren zur Querverkehrswarnung für ein Fahrzeug mit Erkennung von Fahrspuren, Recheneinrichtung sowie Fahrerassistenzsystem
EP3996976A1 (fr) Procédé et dispositif d'assistance pour l'utilisation d'une carte numérique dans un véhicule
DE102022130172B4 (de) Verfahren und Fahrerassistenzsystem zur Unterstützung eines Fahrers beim Fahren in einem Proximitätsbereich einer Trajektorie
DE102018129556A1 (de) Verfahren zum wenigstens teilautonomen Fahren eines Fahrzeugs im Rahmen eines Parkvorgangs
DE102018009416A1 (de) Verfahren zur kooperativen Steuerung einer Bewegung eines Fahrzeuges
DE102018127474A1 (de) Verfahren zum Betreiben eines Sicherheitssystems für ein Kraftfahrzeug; Sicherheitssystem und Computerprogrammprodukt

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20231130

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR