EP4320494A1 - Procédé de planification du comportement d'un véhicule - Google Patents

Procédé de planification du comportement d'un véhicule

Info

Publication number
EP4320494A1
EP4320494A1 (application EP22722431.8A)
Authority
EP
European Patent Office
Prior art keywords
vehicle
area
phantom object
probability
length
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP22722431.8A
Other languages
German (de)
English (en)
Inventor
Chi Zhang
Florian Steinhauser
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZF Friedrichshafen AG
Original Assignee
ZF Friedrichshafen AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZF Friedrichshafen AG filed Critical ZF Friedrichshafen AG
Publication of EP4320494A1 (fr)
Legal status: Pending

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095Predicting travel path or likelihood of collision
    • B60W30/0956Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0097Predicting future conditions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001Planning or execution of driving tasks
    • B60W60/0011Planning or execution of driving tasks involving control alternatives for a single driving scenario, e.g. planning several paths to avoid obstacles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001Planning or execution of driving tasks
    • B60W60/0027Planning or execution of driving tasks using trajectory prediction for other traffic participants
    • B60W60/00274Planning or execution of driving tasks using trajectory prediction for other traffic participants considering possible movement changes
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/166Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00Input parameters relating to infrastructure
    • B60W2552/53Road markings, e.g. lane marker or crosswalk
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/40Dynamic objects, e.g. animals, windblown objects
    • B60W2554/402Type
    • B60W2554/4029Pedestrians
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/40Dynamic objects, e.g. animals, windblown objects
    • B60W2554/404Characteristics
    • B60W2554/4045Intention, e.g. lane change or imminent movement
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/80Spatial relation or speed relative to objects

Definitions

  • The invention relates to a method for planning the behavior of a vehicle according to the preamble of claim 1.
  • The invention further relates to a system, a vehicle, a computer program and a computer-readable medium according to the independent claims.
  • The field of autonomous driving, which typically includes various degrees of automated driving such as assisted driving, semi-autonomous driving or fully autonomous driving, is an area of continuous technical development.
  • One reason for this is that moving a vehicle in an automated manner is usually a very complex task that presents a large number of technical challenges.
  • Methods for automated behavior planning of autonomous vehicles usually calculate probability values that express the probability of certain traffic situations occurring, and base their planning at least in part on such probabilities.
  • DE 10 2019 108 142 A1 describes the use of certain probabilities when choosing options for actions of an automated vehicle.
  • An occlusion scenario is a traffic situation in which a certain part of the surroundings of a vehicle is not visible because this part of the surroundings is occluded by an object.
  • In an occlusion scenario, the field of view of a vehicle for which automated behavior is planned is occluded by an object.
  • Occluding objects can be, for example, other vehicles such as cars, buses or trucks, or permanent infrastructure components such as bridges, walls, buildings or the like.
  • The object of the invention is to overcome or at least mitigate the disadvantages mentioned above.
  • The problem is solved by a method for planning a behavior of a vehicle with respect to one or more occluded areas along a navigation path of the vehicle, the method comprising an occluded area identification step in which the occluded area(s) is/are identified, and a phantom object generation step in which at least one phantom object is generated for at least one of the occluded areas, wherein, during the occluded area identification step, the occluded area(s) is/are defined based on information from a catalog of predefined occlusion scenarios.
  • The method is usually a computer-implemented method.
  • The vehicle is typically an autonomous vehicle, i.e. the vehicle is configured for at least one type of autonomous driving, such as assisted driving, semi-autonomous driving or fully autonomous driving.
  • The term "behavior" describes a vehicle's response to an occluded area, such as slowing down when approaching and/or passing an object causing the occluded area, or gradually creeping forward when passing an object creating the occluded area. In typical embodiments, several occluded areas are identified along the navigation path.
  • A "navigation path" is typically a route that the vehicle is expected to follow.
  • Identifying an occluded area typically means detecting that an occluded area is at least likely to be present along the navigation path. In general, the identification can be carried out by analyzing map data or by analyzing sensor data, in particular sensor data from the vehicle, or by a combination of both.
  • A "phantom object" is a virtual object located in an occluded area. Examples of phantom objects are other vehicles or pedestrians.
  • The phrase "defined based on information from a catalog of predefined occlusion scenarios" may, for example, mean matching the identified occluded area to a predefined type of occluded area found in the occlusion scenario catalog and/or setting certain variables and/or constants corresponding to the chosen type of occluded area to certain values.
  • Such a method for planning the behavior of a vehicle thus generalizes the handling of occlusion scenarios in the behavior planning of autonomous vehicles, making it possible to deal easily and reliably with occluded areas along a navigation path of the vehicle.
  • The occlusion scenario catalog includes different occlusion scenarios and scenario information for each occlusion scenario.
  • The occlusion scenario catalog can contain different types of occluded areas as occlusion scenarios and/or information about each of these types of occluded areas, such as variables or constants that describe the different types of occluded areas.
  • The occlusion scenario catalog includes one or more of the following types of occluded areas: a bus stop, a pedestrian crossing, a school, a taxi rank, a lane, or an intersection.
  • The occlusion scenario catalog includes all possible occlusion scenarios that need to be managed by the vehicle.
  • The occlusion scenario catalog is defined, typically offline, and includes definitions of various risky occluded areas.
  • The occlusion scenario catalog is then used, usually online, to carry out the occluded area identification step, for example by accessing map data and/or sensor data and/or navigation data of the vehicle and by defining the occluded area(s) based on these data and on information found in the occlusion scenario catalog. A minimal sketch of how such a catalog could be represented follows below.
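The catalog can be pictured as a small lookup table keyed by the type of occluded area, with the scenario information (for example the initial environment probability, the distance threshold and the length of a phantom object) stored per type. The following Python sketch is only an illustration of that idea; the patent does not prescribe any data format, and all names and numeric values below (OcclusionScenario, CATALOG, k_env, d_threshold, phantom_length and the example numbers) are assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class OcclusionScenario:
    name: str              # type of occluded area, e.g. "pedestrian_crossing"
    k_env: float           # initial environment probability Kenv (between 0 and 1)
    d_threshold: float     # distance threshold Ds, in metres
    phantom_length: float  # length L of a phantom object, in metres

# Example entries for some of the occluded-area types named in the text;
# the numeric values are placeholders, not values from the patent.
CATALOG = {
    "pedestrian_crossing": OcclusionScenario("pedestrian_crossing", 0.8, 5.0, 1.0),
    "bus_stop": OcclusionScenario("bus_stop", 0.6, 8.0, 1.5),
    "intersection": OcclusionScenario("intersection", 0.5, 10.0, 4.0),
}
```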
  • The phantom object generation step includes calculating an appearance probability for the phantom object, the appearance probability describing the probability of the phantom object emerging from its occluded area into the field of view of the vehicle.
  • "Its occluded area" is typically understood to mean the particular occluded area in which the phantom object has been virtually placed.
  • The calculation of such an appearance probability has the advantage that it makes it possible to find an appropriate compromise between traffic safety on the one hand and traffic flow on the other hand when operating the vehicle.
  • The appearance probability includes a static component, wherein the static component preferably takes into account map and/or road topology information, and/or wherein the static component preferably depends on an initial environment probability and/or a phantom object distance and/or a distance threshold.
  • The initial environment probability is typically a value between 0 and 1 and is typically defined in the occlusion scenario catalog for each type of occluded area.
  • The initial environment probability is an example of the aforementioned scenario information.
  • The "phantom object distance" typically describes a distance between the beginning of a high-risk area, which lies within an occluded area, and the phantom object.
  • A high-risk area is typically the area of particular interest within an occluded area of a given type.
  • For an occluded area of the pedestrian crossing type, the high-risk area is typically the crosswalk itself, while the occluded area may also include areas near the pedestrian crossing that are not part of the crossing itself. Similar definitions apply to the other high-risk areas, such as bus stops, taxi ranks, intersections or the like.
  • The "distance threshold" is a special value of the phantom object distance beyond which the static component has the value 0.
  • The distance threshold preferably lies outside the high-risk area but within the occluded area.
  • The static component has the value of the initial environment probability throughout the high-risk area.
  • The static component falls, preferably linearly, from the end of the high-risk area to the distance threshold, where it reaches the value 0.
  • The appearance probability includes a dynamic component. The dynamic component preferably takes into account, at least indirectly, a geometric change of the occluded area between two points in time. The dynamic component preferably depends on a length of a phantom object, which is defined as a length within an occluded area within which exactly one phantom object is expected; the length of a phantom object is preferably measured in a direction perpendicular to the longitudinal axis of the vehicle. The dynamic component preferably also depends on a field-of-view increase, the field-of-view increase preferably being a length measured in a direction perpendicular to the longitudinal axis of the vehicle, the length of a phantom object and the field-of-view increase typically being parallel.
  • Such a dynamic component is a particularly advantageous way of achieving a good compromise between road safety and traffic flow.
  • The static component is calculated according to an equation (not reproduced in this text) in which Penv(d) denotes the static component and Kenv the initial environment probability, with d the phantom object distance.
  • The dynamic component is preferably calculated according to an equation (not reproduced in this text) in which PFOV(u) denotes the dynamic component, u the field-of-view increase and L the length of a phantom object; the appearance probability is preferably calculated from the static and dynamic components according to a further equation (also not reproduced here). A hedged reconstruction of the described behavior is sketched below.
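The equations themselves are not reproduced in this text, but the behavior they describe is stated: a static component Penv(d) that equals Kenv inside the high-risk area, falls linearly to 0 at the distance threshold Ds and stays 0 beyond it, and a dynamic component PFOV(u) that grows with the field-of-view increase u relative to the phantom object length L. The following Python sketch is a hedged reconstruction of that behavior, not the patent's literal formulas; in particular, the linear form u/L for the dynamic component and the multiplication of the two components are assumptions, and d_s > w_high_risk is assumed.

```python
def static_component(d: float, k_env: float, w_high_risk: float, d_s: float) -> float:
    """Penv(d): equals Kenv inside the high-risk area (0 <= d <= w_high_risk),
    falls linearly to 0 at the distance threshold Ds and is 0 beyond it.
    Reconstructed from the textual description, not the literal patent equation."""
    if d <= w_high_risk:
        return k_env
    if d >= d_s:
        return 0.0
    return k_env * (d_s - d) / (d_s - w_high_risk)

def dynamic_component(u: float, length: float) -> float:
    """PFOV(u): probability that a field-of-view increase u uncovers the phantom
    object when exactly one object is expected per length L, capped at 1.
    The linear form u / L is an assumption consistent with the description."""
    return min(max(u, 0.0) / length, 1.0)

def appearance_probability(p_env: float, p_fov: float) -> float:
    """Combined appearance probability; multiplying the two components is an
    assumption, since the text only states that both components are included."""
    return p_env * p_fov
```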
  • A system for planning the behavior of a vehicle with respect to one or more occluded areas along a navigation path of the vehicle is configured to identify the occluded area(s) and to generate at least one phantom object for at least one of the occluded areas, the system being configured to define the occluded area(s) based on information from a catalog of predefined occlusion scenarios.
  • The system is preferably configured to carry out a method according to one of the previously described embodiments.
  • The system includes an occluded area identification module and/or a phantom object generation module and/or an appearance probability calculation module configured to calculate the above-mentioned appearance probability, which typically includes the static component and/or the dynamic component.
  • At least one of these modules, preferably all of these modules, is/are implemented in software code.
  • The system typically comprises means for performing at least one method according to one of the above-mentioned embodiments, in particular computer hardware such as processing units, memory devices or the like for carrying out the various methods and/or steps mentioned above.
  • A computer program, in a typical embodiment of the invention, comprises instructions which, when the program is executed by a computer, cause the computer to perform a method according to any of the above-mentioned embodiments.
  • The term "computer" should be understood to refer to any device or structure capable of executing the instructions.
  • The computer program can also be referred to as a computer program product.
  • A computer-readable medium comprises computer program code for performing a method according to one of the above-mentioned embodiments and/or comprises a computer program according to the above-mentioned embodiment.
  • The term "computer-readable medium" should be understood to include, but not be limited to, hard drives and/or servers and/or memory sticks and/or flash drives and/or DVDs and/or Blu-ray discs and/or CDs.
  • The term "computer-readable medium" can also refer to a data stream, for example one produced when a computer program and/or a computer program product is downloaded from the Internet.
  • Figure 1 shows a flowchart of an embodiment of a method according to the invention.
  • Figure 2 shows a schematic drawing explaining the static component of the appearance probability.
  • Figure 3 shows a first schematic drawing explaining the dynamic component of the appearance probability.
  • Figure 4 shows a second schematic drawing explaining the dynamic component of the appearance probability.
  • Figure 5 shows a schematic drawing that visualizes the length L of a phantom object.
Description of preferred embodiments
  • FIG. 1 shows a flow chart of an embodiment of a method according to the invention.
  • The method includes an occluded area identification step S1 and a phantom object generation step S2.
  • Steps S1 and S2 are carried out continuously, typically by means of an endless loop.
  • Certain embodiments do not require the method to be performed in an endless loop.
  • It is also possible for the method with steps S1 and S2 to be carried out only on request at certain points in time.
  • In the occluded area identification step S1, occluded areas are identified along a navigation path of the autonomous vehicle, in particular by analyzing map data and by analyzing sensor data provided by sensors present in the autonomous vehicle.
  • This map data and sensor data is matched against the content of an occlusion scenario catalog, such that the detected occluded areas along the vehicle's navigation path are matched to generalized types of occluded areas found in the occlusion scenario catalog.
  • The occluded area identification step S1 further comprises setting typical variables and/or constants for the different identified occluded areas based on information found in the occlusion scenario catalog.
  • In the phantom object generation step S2, phantom objects are generated for at least one of the identified occluded areas, typically for all of the occluded areas.
  • The generation of the phantom objects is typically based at least in part on information from the occlusion scenario catalog and/or on the map data and/or on the sensor data.
  • The phantom object generation step S2 also includes the calculation of the appearance probability(ies) for the phantom object(s). A minimal control-flow sketch of steps S1 and S2 follows below.
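The following Python sketch shows how steps S1 and S2 could be chained, assuming the CATALOG structure sketched above. The function and parameter names (identify_occluded_areas, generate_phantom_objects, map_data entries with a "type" key) are hypothetical and not taken from the patent.

```python
def identify_occluded_areas(map_data, sensor_data, navigation_path, catalog):
    """S1 (sketch): match occlusions detected along the path against catalog types.
    sensor_data and navigation_path are unused in this simplified stub;
    map_data is assumed to be a list of dicts, each with a 'type' key."""
    return [area for area in map_data if area.get("type") in catalog]

def generate_phantom_objects(occluded_areas, catalog):
    """S2 (sketch): place one phantom object per occluded area and attach the
    scenario information needed to compute its appearance probability."""
    return [{"area": area, "scenario": catalog[area["type"]]} for area in occluded_areas]

def plan_behavior_iteration(map_data, sensor_data, navigation_path, catalog):
    occluded_areas = identify_occluded_areas(map_data, sensor_data, navigation_path, catalog)
    return generate_phantom_objects(occluded_areas, catalog)

# Steps S1 and S2 are typically repeated continuously, e.g.:
#   while True:
#       phantom_objects = plan_behavior_iteration(map_data, sensor_data, path, CATALOG)
```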
  • Figure 2 shows a schematic drawing explaining the static component of the appearance probability.
  • Figure 2 shows a vehicle 1 on a street 8.
  • A pedestrian crossing 7 runs across the street 8.
  • The vehicle 1 is located next to an obstacle 2, which is also referred to as an occluding object. Because of the obstacle 2, the vehicle 1 is not able to detect its complete surroundings; in particular, the obstacle 2 creates an occluded area 3.
  • FIG. 2 also shows several pedestrians 4, only one of whom is provided with a reference number for the sake of simplicity.
  • Each pedestrian 4 has a hypothetical path 6 crossing the street 8 in a direction perpendicular to a longitudinal axis 5 of the vehicle 1.
  • FIG. 2 also shows a representation of the static component Penv(d) for the traffic situation that is depicted in FIG. 2.
  • The pedestrian crossing 7 has a width Wz.
  • Within the pedestrian crossing 7, i.e. the high-risk area, the static component Penv(d) has a constant value Kenv, namely the initial environment probability.
  • The initial environment probability Kenv is determined, for example, based on information from the occlusion scenario catalog and/or on the map data.
  • The diagram in the lower part of FIG. 2 shows the static component Penv(d) of the appearance probability as a function of the phantom object distance d. It can be seen that d is measured from the beginning of the pedestrian crossing 7 (the pedestrian crossing 7 itself is the high-risk area); "beginning" means the edge of the pedestrian crossing 7 that is closest to the vehicle 1.
  • FIG. 2 also shows a distance threshold Ds that lies outside the actual high-risk area, i.e. the pedestrian crossing 7. Between the pedestrian crossing 7 and the distance threshold Ds, the static component Penv(d) of the appearance probability decreases linearly from the initial environment probability Kenv to 0. At points farther than the distance threshold Ds from the pedestrian crossing 7, the static component Penv(d) is 0. A small worked numerical example follows below.
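As a worked numerical illustration of this static component, the static_component() sketch above can be evaluated with assumed values (not from the patent): Kenv = 0.8, a crossing width of 3.0 m and a distance threshold Ds = 5.0 m, with d measured from the near edge of the crossing.

```python
static_component(1.0, k_env=0.8, w_high_risk=3.0, d_s=5.0)  # -> 0.8, on the crossing
static_component(4.0, k_env=0.8, w_high_risk=3.0, d_s=5.0)  # -> 0.4, halfway between crossing and Ds
static_component(6.0, k_env=0.8, w_high_risk=3.0, d_s=5.0)  # -> 0.0, beyond Ds
```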
  • FIG. 3 shows a first schematic drawing explaining the dynamic component of the appearance probability.
  • FIG. 3 shows a vehicle 1, but at two different points in time.
  • The vehicle 1t1 is the vehicle at the current time t1.
  • The vehicle 1t0 is the vehicle at an earlier time, namely the time t0.
  • Like the vehicle 1t0, the vehicle 1t1 is located on a street 8 over which a pedestrian crossing 7 runs.
  • The vehicle 1t0, 1t1 has a longitudinal axis 5.
  • The vehicle 1t0, 1t1 drives past the obstacle 2.
  • The obstacle 2 creates an occluded area 3, which is shown in FIG. 3 for the vehicle 1t1.
  • FIG. 3 also shows two edges of a previously occluded area 9.1, 9.2, namely the occluded area as it was for the vehicle 1t0.
  • FIG. 3 also shows a pedestrian 4 and a field-of-view increase u1.
  • The field-of-view increase u1 is a distance (typically measured in meters) measured in the direction perpendicular to the longitudinal axis 5 of the vehicle 1t1.
  • The field-of-view increase u1 is also measured in the direction of the hypothetical walking path 6 of the pedestrian 4.
  • The field-of-view increase u1 corresponds to the increase in the field of view at the position of the pedestrian 4.
  • The dynamic component of the appearance probability for the pedestrian 4 (which is a phantom object) after the time step between the times t0 and t1 is calculated with the field-of-view increase u1, as shown in FIG. 3. It can also be said that u1 is the length of the hypothetical walking path 6 of the pedestrian 4 that is uncovered from the occluded area 3 between the times t0 and t1. In the case of FIG. 3, u1 is large enough for the pedestrian 4 to move out of the occluded area 3. In other words, the dynamic component takes the value 1 in the situation shown in FIG. 3.
  • FIG. 4 shows a situation similar to the one shown in FIG. 3. However, in FIG. 4 the vehicle 1t0, 1t1 has moved a shorter distance along the obstacle 2 than in FIG. 3.
  • The field-of-view increase u2 is therefore smaller than the field-of-view increase u1 shown in FIG. 3.
  • The field-of-view increase u2 shown in FIG. 4 is such that the probability of the pedestrian 4 appearing is still so low that the pedestrian 4 remains in the occluded area 3. A small worked example follows below.
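The two situations of FIG. 3 and FIG. 4 can be illustrated numerically with the dynamic_component() sketch above, using an assumed phantom object length L = 1.2 m (the values are illustrative, not from the patent).

```python
dynamic_component(u=1.5, length=1.2)  # FIG. 3 case: u1 large enough, capped at 1.0
dynamic_component(u=0.3, length=1.2)  # FIG. 4 case: u2 small, 0.25 -> phantom pedestrian assumed still hidden
```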
  • FIG. 5 shows a schematic drawing that visualizes the length L of a phantom object.
  • FIG. 5 shows how the length L of a phantom object is determined at a certain point in the horizontal direction of FIG. 5.
  • The length L of a phantom object is the length, measured in the direction of the hypothetical path 6 of the pedestrian 4, that contains exactly one pedestrian 4.
  • Reference signs (excerpt): 1 vehicle; 2 obstacle, also referred to as occluding object; Penv(d) static component (of the appearance probability).

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to a method for planning the behavior of a vehicle (1, 1t0, 1t1) with respect to one or more occluded areas (3) along a navigation path of the vehicle, the method comprising an occluded area identification step (S1) in which the occluded area(s) (3) is/are identified, and a phantom object generation step (S2) in which at least one phantom object (4) is generated for at least one of the occluded areas (3). In this method, during the occluded area identification step (S1), the occluded area(s) (3) is/are defined on the basis of information from a catalog of predefined occlusion scenarios.
EP22722431.8A 2021-04-09 2022-04-11 Procédé de planification du comportement d'un véhicule Pending EP4320494A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102021203545 2021-04-09
PCT/EP2022/059650 WO2022214709A1 (fr) 2021-04-09 2022-04-11 Procédé de planification du comportement d'un véhicule

Publications (1)

Publication Number Publication Date
EP4320494A1 true EP4320494A1 (fr) 2024-02-14

Family

ID=81597841

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22722431.8A Pending EP4320494A1 (fr) 2021-04-09 2022-04-11 Procédé de planification du comportement d'un véhicule

Country Status (3)

Country Link
US (1) US20240124019A1 (fr)
EP (1) EP4320494A1 (fr)
WO (1) WO2022214709A1 (fr)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102013215098A1 (de) * 2013-08-01 2015-02-05 Bayerische Motoren Werke Aktiengesellschaft Umfeldmodelle für Fahrzeuge
EP3091370B1 (fr) * 2015-05-05 2021-01-06 Volvo Car Corporation Procédé et agencement pour déterminer des trajectoires de véhicule sûres
WO2019089015A1 (fr) * 2017-10-31 2019-05-09 Nissan North America, Inc. Fonctionnement de véhicule autonome avec raisonnement d'occlusion explicite
DE102019108142A1 (de) 2019-03-29 2020-10-01 Bayerische Motoren Werke Aktiengesellschaft Auswählen einer Handlungsoption für ein automatisiertes Kraftfahrzeug

Also Published As

Publication number Publication date
WO2022214709A1 (fr) 2022-10-13
US20240124019A1 (en) 2024-04-18

Similar Documents

Publication Publication Date Title
DE102016203086B4 (de) Verfahren und Vorrichtung zur Fahrerassistenz
AT521607B1 (de) Verfahren und Vorrichtung zum Testen eines Fahrerassistenzsystem
EP3543985A1 (fr) Simulation des situations de circulation diverses pour un véhicule d'essai
DE10133945A1 (de) Verfahren und Vorrichtung zum Austausch und zur Verarbeitung von Daten
DE102016007899B4 (de) Verfahren zum Betreiben einer Einrichtung zur Verkehrssituationsanalyse, Kraftfahrzeug und Datenverarbeitungseinrichtung
EP3024709B1 (fr) Fourniture efficace d'informations d'occupation pour l'environnement d'un véhicule
DE102014003343A1 (de) Verfahren zum Ermitteln eines Spurwechselbedarfs eines Systemfahrzeugs
EP4027245A1 (fr) Procédé mis en uvre par ordinateur destiné à la détermination des valeurs de similitude des scénarios de circulation
EP3850536A1 (fr) Analyse de scénarios spatiaux dynamiques
DE102020200169B3 (de) Verfahren zur Zusammenführung mehrerer Datensätze für die Erzeugung eines aktuellen Spurmodells einer Fahrbahn und Vorrichtung zur Datenverarbeitung
DE102013214631A1 (de) Effizientes Bereitstellen von Belegungsinformationen für das Umfeld eines Fahrzeugs
EP4320494A1 (fr) Procédé de planification du comportement d'un véhicule
DE102019101613A1 (de) Simulieren verschiedener Verkehrssituationen für ein Testfahrzeug
DE102019206847A1 (de) Verfahren und Vorrichtung zum Betreiben eines automatisierten Fahrzeugs
DE10254525A1 (de) Verfahren und Vorrichtung zur Vorhersage des Fahrzeugverhaltens sowie diesbezügliches Computer-Programm-Produkt
DE102020131766A1 (de) Fahrzeugsystem und Verfahren zur Ermittlung eines Parkbereichs für ein Fahrzeug
DE102019115788A1 (de) Automatische Identifizierung einer Ego-Fahrspur und Anpassung einer Fahrtrajektorie für eine Ego-Fahrzeug in Abhängigkeit von der Ego-Fahrspur
DE102019200145A1 (de) Vorrichtung und Verfahren zum Verifizieren von elektronischen Horizonten
DE102022201406B4 (de) Verfahren zum Planen eines Verhaltens eines Fahrzeugs
DE102018217932A1 (de) Verfahren und Vorrichtung zum Betreiben eines automatisierten Fahrzeugs
DE102022129805B3 (de) Verfahren zum Darstellen von symbolischen Fahrbahnmarkierungen auf einer Anzeigeeinheit eines Fahrzeugs sowie Anzeigesystem für ein Fahrzeug
DE102020208878A1 (de) Verfahren zur Kurvenerkennung
DE102019214146A1 (de) Verfahren zum Bereitstellen eines Umfeldmodells, Umfeldmodellbereitstellungssystem und Computerprogramm
WO2024008453A1 (fr) Procédé de prédiction d'une influence d'un usager de la route sur au moins un autre usager de la route, et procédé de fonctionnement d'un véhicule
EP4124542A1 (fr) Procédé et dispositif de détection d'obstacles sur une route

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20231006

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR