EP3176046A1 - Method and device in a motor vehicle for automated driving - Google Patents


Info

Publication number
EP3176046A1
Authority
EP
European Patent Office
Prior art keywords
scenario
motor vehicle
elementary
situation
environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP16197937.2A
Other languages
German (de)
French (fr)
Inventor
David Perdomo Lopez
Christian JÖRDENS
Lutz Junge
Michael Darms
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Volkswagen AG
Original Assignee
Volkswagen AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to DE102015224338.9A priority Critical patent/DE102015224338A1/en
Application filed by Volkswagen AG filed Critical Volkswagen AG
Publication of EP3176046A1 publication Critical patent/EP3176046A1/en


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095Predicting travel path or likelihood of collision
    • B60W30/0956Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/3453Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C21/3492Special cost functions, i.e. other than distance or default speed limit of road segments employing speed data or traffic data, e.g. real-time or historical
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/18Propelling the vehicle
    • B60W30/18009Propelling the vehicle related to particular drive situations
    • B60W30/18154Approaching an intersection
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/04Traffic conditions
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0088Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/161Decentralised systems, e.g. inter-vehicle communication
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/166Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/167Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/42Image sensing, e.g. optical camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/52Radar, Lidar
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/54Audio sensitive means, e.g. ultrasound
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2555/00Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
    • B60W2555/60Traffic rules, e.g. speed limits or right of way
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00Input parameters relating to data
    • B60W2556/45External transmission of data to or from the vehicle
    • B60W2556/65Data transmitted between vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2720/00Output or target parameters relating to overall vehicle dynamics
    • B60W2720/24Direction of travel
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60YINDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
    • B60Y2300/00Purposes or special features of road vehicle drive control systems
    • B60Y2300/18Propelling the vehicle
    • B60Y2300/18008Propelling the vehicle related to particular drive situations
    • B60Y2300/18158Approaching intersection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network-specific arrangements or communication protocols supporting networked applications
    • H04L67/12Network-specific arrangements or communication protocols supporting networked applications adapted for proprietary or special purpose networking environments, e.g. medical networks, sensor networks, networks in a car or remote metering networks

Abstract

The invention relates to a method for the automated driving of a motor vehicle (50), comprising the following steps: providing environment data (7); recognizing objects (113) in the environment data (7); selecting a current scenario (12, 100) by a scenario interpretation device (3) on the basis of the recognized objects (113), wherein a scenario (12, 100) has at least one elementary situation (13, 101, 101.1 - 101.X) to which at least one monitoring area (110.1 - 110.7) and a release criterion (114) are assigned as attributes (108); and, in sequence for each of the elementary situations (13, 101, 101.1 - 101.X) of the current scenario (12, 100): estimating a target point (111.1 - 111.7); planning a trajectory (112.1 - 112.7) to the target point (111.1 - 111.7) by a maneuver planning device (4); querying information about the monitoring area (110.1 - 110.7) from the environment perception device (2) by the scenario interpretation device (3); checking the fulfilment of the at least one release criterion (114) by the scenario interpretation device (3); and, if the at least one release criterion (114) is met, automated driving along the trajectory (112.1 - 112.7) to the target point (111.1 - 111.7) by a controller (5). The invention furthermore relates to an associated device (1).

Description

  • The invention relates to a method and a device in a motor vehicle for automated driving in an environment.
  • Modern motor vehicles have a variety of assistance systems that assist the driver in driving the vehicle. Increasingly, semi-autonomous and autonomous systems are used which permit semi-automatic or fully automatic control of the motor vehicle. The processing of the resulting data sets leads to a large computational effort.
  • From DE 10 2010 010 856 A1, a method for automatically assisting a driver of a motor vehicle is known. In this case, a lane corridor accessible by the motor vehicle from its current position is determined from the position data of the motor vehicle and the data of an electronic image of the vehicle environment, the lane corridor being bounded on both sides by a corridor border. If the distance of the motor vehicle to a corridor border falls below a threshold, a lateral and/or longitudinal guidance intervention that the driver can override is performed, in particular by applying a steering torque to the steering of the motor vehicle, by changing the steering-ratio characteristic of an active steering system of the motor vehicle, by changing the characteristic of the accelerator pedal of the motor vehicle, and/or by changing the engine torque of the engine of the motor vehicle.
  • From DE 10 2010 018 333 A1, a method and an evaluation device for evaluating information of at least one image recorded by at least one optical camera are known, the at least one optical camera being part of a vehicle. The method comprises the following steps: at least one image is recorded by means of the at least one optical camera, at least during a drive of the vehicle on a roadway. Furthermore, an anticipated course of the road in an environment of the vehicle is determined by means of at least one determining device of the vehicle. In addition, at least one region of the at least one image is determined which includes at least a portion of the determined anticipated course of the road in the vicinity of the vehicle. Finally, the image information contained in the at least one determined region of the at least one image is evaluated.
  • From DE 10 2009 008 745 A1, a method and a system for automatic traffic guidance are known. In this context, traffic-relevant information relating to the surroundings of several road users or to a traffic infrastructure (e.g. devices which detect and/or regulate traffic) is automatically detected and transmitted as environment data to a central control unit. The central control unit then maps the environment data into a general environment model, so that this general environment model describes the surroundings of the road users. To support the traffic guidance of a respective road user, the central control unit automatically transmits to that road user the traffic guidance data from the general environment model that concern his surroundings.
  • The invention is based on the technical problem of providing a method and a device for the automated driving of a motor vehicle in an environment in which a reduced amount of computation is necessary in the interpretation of the environmental data.
  • The technical problem is solved by a method with the features of claim 1 and a device having the features of claim 5. Advantageous embodiments of the invention will become apparent from the dependent claims.
  • In the following, a scenario denotes a set of temporally ordered and interconnected elementary situations. Such a scenario may be, for example, turning left, driving straight ahead, or turning right at an intersection.
  • An elementary situation denotes a moment or a temporal portion within a scenario. An elementary situation comprises a scene and information about the state of the motor vehicle.
  • A scene comprises a scenery, the scenery including the static objects of the environment, for example lane markings, traffic signs, etc. Furthermore, the scene also includes dynamic objects, such as traffic light phases, other road users, etc.
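These definitions suggest a simple data hierarchy. The following sketch models it in Python; all class and field names (`Scenario`, `ElementarySituation`, `monitoring_area`, and so on) are illustrative assumptions, not terminology fixed by the claims:

```python
# Illustrative data model for the scenario hierarchy described above.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Scene:
    static_objects: List[str]    # scenery: e.g. lane markings, traffic signs
    dynamic_objects: List[str]   # e.g. traffic light phases, other road users

@dataclass
class ElementarySituation:
    scene: Scene
    # Attributes assigned to each elementary situation:
    monitoring_area: Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max)
    release_criterion: str
    target_point: Tuple[float, float] = (0.0, 0.0)

@dataclass
class Scenario:
    name: str
    # Elementary situations are temporally ordered and interconnected.
    situations: List[ElementarySituation] = field(default_factory=list)

# A left turn at an intersection might decompose like this:
left_turn = Scenario("left_turn_x_junction", [
    ElementarySituation(Scene(["zebra crossing", "lane markings"], ["pedestrians"]),
                        (0.0, -2.0, 10.0, 2.0), "no_pedestrians_on_crossing"),
    ElementarySituation(Scene(["priority sign"], ["cross traffic"]),
                        (-20.0, -20.0, 20.0, 20.0), "trajectory_collision_free"),
])
```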
  • There is provided a method for the automated driving of a motor vehicle, comprising the steps of: providing environment data; recognizing objects in the environment data by an environment perception device; selecting a current scenario from a provided set of known scenarios by a scenario interpretation device on the basis of the recognized objects, wherein a scenario has at least one elementary situation and wherein each elementary situation is assigned, as attributes, at least one monitoring area and a release criterion; and, sequentially for each of the elementary situations of the current scenario: estimating a target point for the current elementary situation by the scenario interpretation device; planning a trajectory to the target point by a maneuver planning device; querying information about the monitoring area assigned to the elementary situation from the environment perception device by the scenario interpretation device; checking fulfilment of the at least one release criterion on the basis of the queried information by the scenario interpretation device; and, if the at least one release criterion is met, automated driving along the trajectory to the target point by controlling at least one actuator of the motor vehicle by a controller.
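The claimed sequence of steps can be sketched as a processing loop. The component classes below are hypothetical stubs standing in for the environment perception device, scenario interpretation device, maneuver planning device, and controller; they are not part of the patent:

```python
from collections import namedtuple

# Lightweight stand-in for an elementary situation with its two attributes.
Situation = namedtuple("Situation", "monitoring_area release_criterion target_point")

class Perception:
    def query(self, area):
        """Return objects detected inside the given monitoring area only."""
        return []                          # stub: the area is clear

class Interpreter:
    def estimate_target_point(self, situation):
        return situation.target_point
    def criterion_met(self, criterion, info):
        return len(info) == 0              # stub: released when the area is clear

class Planner:
    def plan_trajectory(self, target):
        return [(0.0, 0.0), target]        # stub: straight-line trajectory

class Controller:
    def __init__(self):
        self.reached = []
    def drive_along(self, trajectory, target):
        self.reached.append(target)        # stub: record the actuated drive

def drive_scenario(situations, perception, interpreter, planner, controller):
    """Work off the elementary situations of the current scenario in order."""
    for situation in situations:
        target = interpreter.estimate_target_point(situation)
        trajectory = planner.plan_trajectory(target)
        info = perception.query(situation.monitoring_area)   # only this subarea
        if not interpreter.criterion_met(situation.release_criterion, info):
            return False                   # release criterion not met
        controller.drive_along(trajectory, target)
    return True                            # all elementary situations completed
```

Note that perception is queried only for the monitoring area of the current elementary situation, which is the source of the claimed reduction in computational effort.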
  • Furthermore, there is provided a device for the automated driving of a motor vehicle, comprising: an environment perception device for recognizing objects in provided environment data; a scenario interpretation device for selecting a current scenario from a provided set of known scenarios on the basis of the recognized objects, wherein a scenario has at least one elementary situation and wherein each elementary situation is assigned, as attributes, at least one monitoring area and a release criterion; a maneuver planning device; and a controller. The scenario interpretation device is designed to estimate a target point for a current elementary situation, to query information about the monitoring area of the elementary situation from the environment perception device, and to check fulfilment of the at least one release criterion of the current elementary situation on the basis of the queried information. The controller is designed, provided that the at least one release criterion is met, to drive the motor vehicle automatically along a trajectory planned by the maneuver planning device to the target point by controlling at least one actuator of the motor vehicle. The scenario interpretation device is further designed to process, after the target point has been reached, the next elementary situation of the current scenario.
  • The core idea of the invention is to improve the scenario interpretation. For this purpose, a scenario is broken down into several elementary situations, each of which is assigned at least one monitoring area and a release criterion as attributes. The underlying idea is to evaluate not all information about the environment of the motor vehicle, but only the information relevant to the current elementary situation.
  • If the scenario interpretation device recognizes or selects a specific scenario from the provided set of scenarios, for example a left turn at an intersection, this scenario comprises several elementary situations which are passed through step by step. In each current elementary situation, the complete environment of the motor vehicle no longer has to be monitored, but only the monitoring area assigned to this elementary situation.
  • For example, if, at the beginning of the scenario, the motor vehicle is in an elementary situation in front of an intersection without a traffic light regulating the traffic, and there is a zebra crossing (i.e. pedestrians always have priority), the scenario interpretation device recognizes that the motor vehicle is in an elementary situation without traffic lights, and the monitoring area may comprise only the zebra crossing and the areas of the sidewalk to the left and right of it. It is then checked whether the release criterion assigned to the current elementary situation is fulfilled in this monitoring area. If, for example, the release criterion is that there are no pedestrians on the zebra crossing or in its vicinity to the right and left of it, the scenario interpretation device queries the environment perception device as to whether pedestrians are present in this area. If it receives from the environment perception device the information that there are no pedestrians, the release criterion is fulfilled and the scenario interpretation device initiates the automated drive to a target point estimated by the scenario interpretation device. The next elementary situation is then worked off.
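The zebra-crossing example amounts to a spatial query followed by a boolean release check. A minimal sketch, assuming a rectangular monitoring area and a simplified object format (both assumptions, not from the patent):

```python
# Illustrative check of the zebra-crossing release criterion.
def pedestrians_in_area(detected_objects, area):
    """Return detected pedestrians lying inside the rectangular monitoring area."""
    x_min, y_min, x_max, y_max = area
    return [o for o in detected_objects
            if o["type"] == "pedestrian"
            and x_min <= o["x"] <= x_max and y_min <= o["y"] <= y_max]

def crossing_release_criterion_met(detected_objects, crossing_area):
    """Release criterion: no pedestrians on or next to the zebra crossing."""
    return not pedestrians_in_area(detected_objects, crossing_area)

# The monitoring area covers the crossing plus the sidewalk to its left and right.
area = (-2.0, -1.0, 12.0, 1.0)
objects = [{"type": "pedestrian", "x": 5.0, "y": 0.0},
           {"type": "car", "x": 30.0, "y": 0.0}]
print(crossing_release_criterion_met(objects, area))   # pedestrian present, so False
```

Objects outside the monitoring area (such as the distant car above) never enter the check, which mirrors the reduced evaluation effort described in the text.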
  • The breakdown of the scenario into several elementary situations has the advantage that a complex scenario in road traffic can be solved as a sequence of substeps. The decomposition into elementary situations makes it possible to monitor only a subarea of the environment, so that only a reduced amount of data has to be processed and evaluated. This results in a considerable reduction of the computational effort.
  • The set of provided scenarios includes, for example, a left turn or a right turn at a T-junction or an X-junction, or driving straight ahead at an X-junction with or without traffic signs (stop sign, right of way) or with a traffic signal system regulating the traffic. A scenario may also be provided or learned beforehand, for example for turning from a public road into a private driveway, etc.
  • In order to select the scenario, the scenario interpretation device preferably also accesses information about a planned route provided by the maneuver planning device. The route includes, for example, the roads to be traveled from a starting point to a destination of the automated journey. Taking the route into account makes it easy to anticipate a scenario in good time, for example a left turn when the route involves such a maneuver.
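Pre-selecting a scenario from the planned route might look like the following sketch; the route representation and the scenario keys are illustrative assumptions:

```python
# Hypothetical pre-selection of a scenario from the next maneuver on the route.
def preselect_scenario(route, known_scenarios):
    """Pick the scenario implied by the next maneuver and junction on the route."""
    next_leg = route[0]
    key = f"{next_leg['maneuver']}_{next_leg['junction']}"
    return known_scenarios.get(key)        # None if no known scenario fits

known = {
    "turn_left_x_junction": "scenario: left turn at X-junction",
    "straight_x_junction": "scenario: straight ahead at X-junction",
}
route = [{"maneuver": "turn_left", "junction": "x_junction"},
         {"maneuver": "straight", "junction": "x_junction"}]
print(preselect_scenario(route, known))    # scenario: left turn at X-junction
```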
  • In one embodiment, it is provided that the environment data are provided by at least one sensor, wherein the at least one sensor detects the environment of the motor vehicle. Suitable sensors include all known sensor systems, such as cameras, ultrasonic sensors, laser scanners, radar sensors, TopView cameras, etc.
  • In one embodiment, it is provided that the environment data are alternatively or additionally provided in the form of an environment map. This may be, for example, a road map in which, in addition to the course of the roads, further information is stored, such as traffic signs, right-of-way rules, traffic lights, etc. The information stored in such a road map facilitates the selection of a scenario; for example, even with incomplete environment data, a T-junction can be distinguished from an X-junction with high probability.
  • In a further embodiment, it is provided that the environment data are alternatively or additionally provided via a communication interface. This makes it possible, for example, to take into account Car2X data provided by other motor vehicles or traffic management systems.
  • In one embodiment, it is provided that the scenario interpretation device continuously checks whether the selected current scenario is still present or whether a change of the current scenario is necessary. This allows dynamic adaptation of the current scenario to the environment. For example, a scenario erroneously selected due to insufficient data from the environment perception device can be corrected in this way. If, for example, a T-junction scenario has been selected and it turns out, when approaching the intersection, that it is not a T-junction but an X-junction, this can be corrected by selecting a corresponding scenario for the X-junction.
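The continuous plausibility check could be sketched as follows, assuming the junction type is the distinguishing feature; the names and the table layout are illustrative, not from the patent:

```python
# Sketch of the continuous scenario plausibility check: if the newly perceived
# junction topology contradicts the selected scenario, switch scenarios.
def update_scenario(current_scenario, perceived_junction_type, scenario_table):
    """Re-select the scenario when perception contradicts the current choice."""
    if current_scenario["junction"] == perceived_junction_type:
        return current_scenario            # selection is still plausible
    maneuver = current_scenario["maneuver"]
    return scenario_table[(maneuver, perceived_junction_type)]  # corrected scenario

table = {
    ("turn_left", "t_junction"): {"maneuver": "turn_left", "junction": "t_junction"},
    ("turn_left", "x_junction"): {"maneuver": "turn_left", "junction": "x_junction"},
}
current = table[("turn_left", "t_junction")]
# Approaching the intersection, perception reveals an X-junction instead:
corrected = update_scenario(current, "x_junction", table)
print(corrected["junction"])               # x_junction
```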
  • In one embodiment, it is provided that, within the at least one monitoring area of an elementary situation, future trajectories of the recognized objects are predicted by a prediction device. This makes it possible to predict or estimate the future behavior of the objects. From the estimated future trajectories, it can be assessed where the objects will be, how they will be positioned relative to the motor vehicle, and whether an object is relevant to the motor vehicle in the current elementary situation.
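In the simplest case, a prediction device could extrapolate object motion with a constant-velocity model, as in this sketch (a real prediction device would likely use richer motion models):

```python
# Minimal constant-velocity prediction of an object's future trajectory
# inside a monitoring area; parameters are illustrative assumptions.
def predict_trajectory(x, y, vx, vy, horizon=3.0, dt=0.5):
    """Predict future positions over `horizon` seconds in steps of `dt`,
    assuming constant velocity (vx, vy) in m/s."""
    steps = int(horizon / dt)
    return [(x + vx * dt * k, y + vy * dt * k) for k in range(1, steps + 1)]

# Pedestrian at (0, 0) walking 1 m/s in +x direction:
print(predict_trajectory(0.0, 0.0, 1.0, 0.0))
# -> [(0.5, 0.0), (1.0, 0.0), (1.5, 0.0), (2.0, 0.0), (2.5, 0.0), (3.0, 0.0)]
```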
  • In one embodiment, it is further provided that one release criterion is that the trajectory of the motor vehicle to the next target point is collision-free with the trajectories of other objects. This allows a conflict-free automated journey to the target point of the current elementary situation. To increase safety, a safety corridor can additionally be defined around the trajectories, which must be kept clear to avoid collisions.
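The collision-free release criterion with a safety corridor can be approximated by a time-step-wise distance check between the ego trajectory and the predicted object trajectories; the margin value below is an arbitrary illustration:

```python
# Illustrative collision check: the release criterion holds when the ego
# trajectory keeps at least a safety margin (the "safety corridor") from
# every predicted object trajectory at each common time step.
import math

def trajectories_collision_free(ego_traj, object_trajs, safety_margin=1.5):
    """Check time-step-wise distance between ego and object trajectories."""
    for obj_traj in object_trajs:
        for (ex, ey), (ox, oy) in zip(ego_traj, obj_traj):
            if math.hypot(ex - ox, ey - oy) < safety_margin:
                return False       # predicted collision or corridor violation
    return True                    # release criterion fulfilled

ego = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
crossing_car = [(2.0, 5.0), (2.0, 2.5), (2.0, 0.0)]  # reaches (2, 0) at the same step
print(trajectories_collision_free(ego, [crossing_car]))   # False
```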
  • An embodiment provides that at least one elementary situation has a further attribute, the further attribute being a specific object whose presence the scenario interpretation device queries from the environment perception device. In this way, additional information, such as a specific traffic sign, can be stored. In addition, an associated release criterion can be stored with the object.
  • In a further embodiment, it is provided that the estimate of the target point is dynamically adapted by the scenario interpretation device on the basis of the information queried from the environment perception device. The ability to dynamically adapt a target point allows a flexible response to the circumstances or to changes in the environment of the motor vehicle. This may be necessary, for example, if information becomes available in the current elementary situation that was not yet available when the target point was originally estimated. For example, it is possible to react flexibly to a construction site that is only detected when approaching it.
  • In general, an automated journey can always be influenced or aborted. For example, the driver of the motor vehicle can intervene in the automated drive at any time, or other devices of the motor vehicle provided for this purpose can stop or correct it. If, for example, a collision with another object occurs, the automated drive is interrupted and the motor vehicle is brought to a standstill.
  • The invention is explained in more detail below with reference to preferred embodiments and the figures, in which:
  • Fig. 1
    a schematic representation of components of a typical scenario;
    Fig. 2
    a schematic representation of a typical traffic situation at an X-junction to explain possible points of conflict when turning left;
    Fig. 3a
    a schematic representation of a typical traffic situation at an X-junction to explain different scenarios, taking into account the right-before-left priority rule;
    Fig. 3b
    a schematic representation of a typical traffic situation at an X-junction for explaining different scenarios, taking into account the right-of-way rules established by traffic signs, wherein the motor vehicle enjoys right of way;
    Fig. 3c
    a schematic representation of a typical traffic situation at an X-junction to explain different scenarios taking into account right-of-way rules based on traffic signs, whereby the motor vehicle does not enjoy right of way;
    Fig. 3d
    a schematic representation of a typical traffic situation at an X-junction to explain different scenarios, taking into account the right of way regulated by a traffic signal system, wherein compatible scenarios are considered;
    Fig. 3e
    a schematic representation of a typical traffic situation at an X-junction for explaining different scenarios, taking into account the right of way regulated by a traffic signal system, wherein conditionally compatible scenarios are considered;
    Fig. 4
    a schematic representation of the linking of several elementary situations within a scenario;
    Fig. 5a
    a schematic representation of an exemplary scenario of successive elementary situations when turning;
    Fig. 5b
    a schematic representation of further successive elementary situations of the exemplary turning scenario (continuation of Fig. 5a);
    Fig. 6a
    a schematic representation for explaining target points, which are assigned to the individual elementary situations when turning left;
    Fig. 6b
    a schematic representation for explaining target points which are assigned to the individual elementary situations when turning right;
    Fig. 7a
    a schematic flow diagram for explaining a turning maneuver, wherein the turning maneuver is based on elementary situations;
    Fig. 7b
    a legend for the elementary situations shown in Fig. 7a;
    Fig. 8
    a schematic representation of an apparatus for automated driving;
    Fig. 9a
    a schematic representation of a traffic situation to explain an elementary situation;
    Fig. 9b
    a schematic representation of the traffic situation to explain a subsequent elementary situation;
    Fig. 9c
    a schematic representation of the traffic situation to explain a subsequent elementary situation;
    Fig. 9d
    a schematic representation of the traffic situation to explain a subsequent elementary situation.
  • Fig. 1 shows a schematic representation of the structure of a scenario 100. A scenario 100 is, for example, a left turn at an intersection. The scenario comprises at least one elementary situation 101. If the scenario 100 comprises several elementary situations 101, these are temporally ordered and connected to one another. Each elementary situation 101 comprises a scene 102 and a description of the state 103 of the motor vehicle. The scene 102 includes, for example, a scenery 104 and dynamic elements 105. The scenery 104 includes all static objects 113 of the environment; the dynamic elements 105 include, for example, traffic light phases, etc. The state 103 of the motor vehicle includes, for example, an action 106 of the driver and automated actions 107 of a controller.
  • Fig. 2 shows a schematic representation of a typical traffic situation 60 at an intersection 61 for explaining possible conflict points 71, 72, 73, 74, 75 which may occur between a left-turning motor vehicle 50 and other road users. The intersecting roads 62.1, 62.2 each have one lane 63 per direction. A motor vehicle 50, 51, 52, 53 approaches the intersection 61 from each direction. In addition, there are several pedestrians 81, 82, 83 who are crossing or wish to cross the roads 62.1, 62.2 on the crosswalks 64.1, 64.2. Depending on how the other motor vehicles 51, 52, 53 plan to proceed, the left-turning motor vehicle 50 encounters the conflict points 72, 73, 74, at which the trajectories 90, 91, 92, 93 of the motor vehicles 50, 51, 52, 53 can collide with one another. In addition, there are two possible conflict points 71, 75 of the trajectory 90 of the motor vehicle 50 with the trajectories 94, 95 of the pedestrians 81, 82, 83.
  • A first point of conflict 71 results with a pedestrian 81 coming from the right. The trajectory 94 of the pedestrian 81 and the trajectory 90 of the motor vehicle 50 intersect. The motor vehicle must therefore wait until the pedestrian 81 has crossed the crosswalk 64.1. Only then can the motor vehicle 50 pass the crosswalk 64.1. A further point of conflict 72 then results with the motor vehicle 53 coming from the left, since the trajectories 90, 93 overlap. Depending on the traffic regulations currently applicable to the traffic situation 60, the motor vehicle 50 must let the motor vehicle 53 coming from the left pass. Here, a "give way" sign 65 is present for the road 62.1, on which the motor vehicle 50 is located, and a priority road sign 66 for the road 62.2, on which the other motor vehicle 53 is located. The motor vehicle 50 must therefore wait until the other motor vehicle 53 has passed the intersection 61. Once this has happened, the next conflict points 73, 74 arise with the motor vehicles 51, 52, which come from the right and from the opposite direction. These motor vehicles 51, 52 also have priority in the traffic situation 60, so that the motor vehicle 50 must wait for them to pass. The last point of conflict 75 may be a collision with the pedestrians 82, 83, who want to cross the crosswalk 64.2. Again, the motor vehicle 50 must wait until they have crossed the crosswalk 64.2.
  • The conflict points 71, 72, 73, 74, 75 illustrate in this example that the traffic situation 60 can be broken down into several sub-problems or sub-steps. As a whole, the traffic situation 60 represents, in the context of the invention, a scenario which corresponds to turning left from a single-lane road 62.1 at an X-junction 61 onto a single-lane road 62.2 without right of way. The situations or sub-problems associated with the individual conflict points 71, 72, 73, 74, 75 then correspond to the elementary situations of the scenario.
  • FIGS. 3a to 3e show schematic representations of the traffic situation at an X-junction already shown in Fig. 2, to explain the possible maneuvers under different right-of-way and priority regulations. Each of the representations corresponds to a scenario.
  • Generally, there are three possible maneuvers at the X-junction that the motor vehicle can perform: driving straight ahead, turning right and turning left. Each possibility results in a different scenario, and the scenarios can each consist of different elementary situations.
  • Fig. 3a shows the three maneuvers or scenarios when the right of way is regulated according to the right-before-left rule (priority for traffic from the right). In all three scenarios, the right of way of the pedestrians 81 on the respective crosswalk 64 must be observed. If the motor vehicle 50 drives straight ahead, there is a point of conflict 71 with a pedestrian from the right. Then there is a point of conflict 72 with another motor vehicle 51 from the right and another point of conflict 73 with a pedestrian 81 from the right.
  • In the scenario in which the motor vehicle 50 wants to turn right, it must first wait for the pedestrian 81 from the right, then for pedestrians 81 who want to cross the road 62.2 into which the motor vehicle 50 turns. When turning left, there is a point of conflict 71 with a pedestrian 81 from the right, then a point of conflict 72 with another motor vehicle 51 from the right and a point of conflict 73 with an oncoming motor vehicle 52, and finally a last point of conflict 74 with pedestrians 81 crossing the road 62.2 into which the motor vehicle 50 turns.
  • FIGS. 3b and 3c show the two scenarios in which the right of way is regulated by means of traffic signs, both for the case that the motor vehicle 50 has priority and for the case that it does not. With right of way, the motor vehicle 50 must wait for pedestrians 81 on the crosswalk 64 while driving straight ahead. When turning right, it is likewise only necessary to wait for the pedestrians 81 on the crosswalks 64. When turning left, however, it must additionally wait for an oncoming motor vehicle 51. In the scenarios without right of way, the motor vehicle must wait for pedestrians 81 and other motor vehicles 51, 53 from the right and left. When turning left, it must wait for pedestrians 81 and other motor vehicles 51, 52, 53 from all other directions.
  • FIGS. 3d and 3e show the scenarios for the case that a traffic signal (traffic light) controls the traffic. A distinction is made between compatible and conditionally compatible scenarios. A compatible scenario has no conflict points. By contrast, a conditionally compatible scenario has at least one potential point of conflict, for example an intersection with the trajectory of a pedestrian (or cyclist) or with the trajectory of another motor vehicle. The priority rules applicable in the scenario decide whether a maneuver is compatible or conditionally compatible.
  • When driving straight ahead, there is no need to wait for any other road user. When turning right, it is only necessary to wait for the pedestrian 81 in the conditionally compatible scenario; in the compatible scenario, by contrast, there is no potential point of conflict, so that the motor vehicle 50 can turn without waiting. When turning left, in the compatible scenario there is likewise no need to wait for other road users, since the motor vehicle 50 enjoys priority over all of them. In the conditionally compatible scenario, however, the motor vehicle 50 must wait for an oncoming motor vehicle 52 and pedestrians 81.
  • Fig. 4 shows a schematic representation of a scenario 100 which has several elementary situations 101.0, 101.1, 101.2, 101.3, 101.X. The elementary situations 101.0, 101.1, 101.2, 101.3, 101.X are linked with one another in time and each have as attributes 108 at least one monitoring area 110 and a release criterion 114. Only when the respective release criterion 114 is fulfilled in the current elementary situation 101.0, 101.1, 101.2, 101.3, 101.X does a transition to the subsequent elementary situation 101.1, 101.2, 101.3, 101.X take place. It is possible that for an elementary situation 101.0 there is more than one potential subsequent elementary situation 101.1, 101.2. Which of the potential subsequent elementary situations 101.1, 101.2 follows the elementary situation 101.0 results from the current environment, i.e. from the acquired environment data. For example, if the elementary situation 101.0 describes an approach to an intersection, the intersection may be either an X-junction or a T-junction. If the environment data provided do not yet allow a clear decision, only two hypotheses 109 can be set up: the X-junction then corresponds to the first potential subsequent elementary situation 101.1, the T-junction to the second potential subsequent elementary situation 101.2. As the motor vehicle continues to approach the intersection, the environment data provided will generally become better, so that at some point a clear decision can be taken or a correction 115 of the previous hypothesis can be carried out. In this way, the current scenario is continuously reviewed and, if necessary, corrected on the basis of currently provided environment data. This permits an always up-to-date interpretation.
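The temporal chaining of elementary situations with release criteria and competing hypotheses can be sketched as a small state machine. This is a minimal sketch under assumed names; the release criteria and guards are simple predicates over a dictionary of environment data, standing in for the environment perception device.

```python
# Each elementary situation holds a release criterion (a predicate over
# environment data) and a list of potential successors (hypotheses 109).
class ElementarySituation:
    def __init__(self, name, release_criterion):
        self.name = name
        self.release_criterion = release_criterion  # env -> bool
        self.successors = []  # list of (guard, successor) hypothesis pairs

    def next_situation(self, env):
        """Stay in the current situation until the release criterion is met,
        then pick the successor hypothesis the environment data support."""
        if not self.release_criterion(env):
            return self  # criterion not yet fulfilled: no transition
        for guard, succ in self.successors:
            if guard(env):
                return succ
        return None  # no successor: scenario finished

approach = ElementarySituation("approach", lambda env: env["distance_m"] < 30)
x_junction = ElementarySituation("X-junction", lambda env: False)
t_junction = ElementarySituation("T-junction", lambda env: False)
# Two hypotheses about the intersection type; better sensor data decide.
approach.successors = [
    (lambda env: env["junction_type"] == "X", x_junction),
    (lambda env: env["junction_type"] == "T", t_junction),
]

current = approach.next_situation({"distance_m": 25, "junction_type": "X"})
print(current.name)  # → X-junction
```

Re-evaluating `next_situation` on every perception cycle gives the continuous review and correction of the current scenario described above.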
  • FIGS. 5a and 5b show the elementary situations 101.1, 101.2, 101.3, 101.4, 101.5, 101.6, 101.7 of an exemplary scenario which describes the turning of a motor vehicle 50 from a three-lane road onto a two-lane road, where the right of way is regulated by traffic signs and the motor vehicle 50 has no right of way. In the first elementary situation 101.1, the motor vehicle 50 changes to the left lane and approaches the intersection 61. The scenario interpretation device recognizes that the motor vehicle 50 is approaching an X-junction 61 and that a left turn is planned. This elementary situation 101.1 has as an attribute a monitoring area 110.1, which comprises the pedestrian crossing and the areas to the left and right on the pedestrian path, for monitoring a potential point of conflict.
  • The motor vehicle 50 has to wait for crossing pedestrians (or cyclists); only when no pedestrians are crossing or intend to cross the street is the release criterion associated with the elementary situation 101.1 met.
  • Compared to monitoring the entire intersection 61, monitoring only the monitoring area 110.1 is advantageous, since much less information must be evaluated. For assessing the release criterion in an elementary situation, only the associated monitoring areas need to be monitored in each case. The scenario interpretation device queries the environment perception device as to whether objects are within the monitoring area 110.1. In addition, future trajectories of the objects can be estimated on the basis of detected or estimated movement data, for example by a prediction device, and then made available to the scenario interpretation device.
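The monitoring-area query can be sketched as a simple geometric filter over the detected objects. A minimal sketch assuming an axis-aligned rectangular monitoring area and objects with 2D positions; the data layout and names are illustrative, not the patent's interfaces.

```python
# A release criterion is evaluated only over the monitoring area, not the
# whole intersection: far fewer objects must be considered.
def objects_in_area(objects, area):
    """Return detected objects whose (x, y) position lies inside the
    monitoring area, given as (x_min, y_min, x_max, y_max)."""
    x0, y0, x1, y1 = area
    return [o for o in objects if x0 <= o["x"] <= x1 and y0 <= o["y"] <= y1]

def release_criterion_met(objects, area):
    # Release only when no relevant object remains inside the area.
    return not objects_in_area(objects, area)

detected = [
    {"id": "pedestrian-81", "x": 2.0, "y": 5.0},   # on the crosswalk
    {"id": "vehicle-51", "x": 40.0, "y": -3.0},    # far outside the area
]
crosswalk_area = (0.0, 0.0, 10.0, 8.0)  # e.g. monitoring area 110.1

print(release_criterion_met(detected, crosswalk_area))  # → False
```

Here the pedestrian inside the crosswalk area blocks the release, while the distant vehicle is never evaluated against this elementary situation.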
  • The next elementary situation 101.2 follows the first elementary situation 101.1 and takes into account as a potential point of conflict a vehicle coming from the left. Accordingly, the elementary situation 101.2 is assigned as an attribute a monitoring area 110.2 on the crossing lane to the left in front of the motor vehicle 50 and a corresponding release criterion (no other motor vehicle from the left).
  • Another elementary situation 101.3 considers as potential conflict point another motor vehicle coming from the right. Accordingly, the elementary situation 101.3 is assigned as an attribute a monitoring area 110.3 on the crossing lane to the right in front of the motor vehicle 50 and a corresponding release criterion (no other motor vehicle from the right).
  • Another elementary situation 101.4 combines the two previous elementary situations 101.2, 101.3. This elementary situation 101.4 is assigned as attributes a monitoring area 110.4 on the crossing lane both to the right and to the left in front of the motor vehicle 50 and a corresponding release criterion (no other motor vehicle from the left or right).
  • In a subsequent elementary situation 101.5 ( Fig. 5b ), the motor vehicle 50 is already in the middle of the intersection 61. Here, the motor vehicle 50 must wait for pedestrians; accordingly, the elementary situation 101.5 is assigned a monitoring area 110.5 which includes the walkways to the left and right of the motor vehicle 50 and the pedestrian crossing. An associated release criterion is that no pedestrian is on the pedestrian crossing and no pedestrian intends to cross the street anymore. A corresponding elementary situation 101.7 results for the motor vehicle 50 when turning right. Here too, a monitoring area 110.7 must be monitored for weaker road users who are crossing or want to cross the road.
  • For the same position of the motor vehicle 50 at the intersection, another elementary situation 101.6 is, for example, that the motor vehicle 50 must wait for an oncoming motor vehicle. The monitoring area 110.6 and the release criterion are then defined accordingly.
  • FIGS. 6a and 6b show, by way of example, target points 111.1, 111.2, 111.3, 111.4 estimated by the scenario interpretation device for a left turn and target points 111.5, 111.6, 111.7 estimated for a right turn at an intersection 61. In each of the elementary situations (see FIGS. 5a and 5b ) a target point 111.1, 111.2, 111.3, 111.4, 111.5, 111.6, 111.7 is estimated. In principle, it is also possible that the same target point 111.1, 111.2, 111.3, 111.4, 111.5, 111.6, 111.7 is assigned to several elementary situations. Each time the at least one release criterion of the current elementary situation is met, the motor vehicle 50 is driven automatically to the target point 111.1, 111.2, 111.3, 111.4, 111.5, 111.6, 111.7 assigned to that elementary situation. A maneuver planning device takes over the planning of the concrete trajectory 112.1, 112.2, 112.3, 112.4, 112.5, 112.6, 112.7 from one target point 111.1, 111.2, 111.3, 111.4, 111.5, 111.6, 111.7 to the next, and a controller carries out the automated journey along the planned trajectory 112.1, 112.2, 112.3, 112.4, 112.5, 112.6, 112.7 by controlling at least one actuator of the motor vehicle 50.
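The stepwise advance from target point to target point can be sketched as a loop over (target point, release criterion) pairs. This is a simplified sketch: the coordinates, criteria names and the flat environment dictionary are assumptions, and trajectory planning between points is abstracted away entirely.

```python
# The vehicle is driven to the target point of each elementary situation
# only once that situation's release criterion is fulfilled.
def drive_scenario(target_points, criteria, env):
    """target_points[i] belongs to elementary situation i; criteria[i] is its
    release criterion (env -> bool). Returns the target points reached."""
    reached = []
    for point, criterion in zip(target_points, criteria):
        if not criterion(env):
            break  # wait in the current elementary situation
        reached.append(point)  # maneuver planner drives here next
    return reached

targets = [(0, 10), (4, 14), (8, 14), (12, 10)]  # e.g. target points 111.1-111.4
criteria = [
    lambda env: not env["pedestrian_on_crosswalk"],
    lambda env: not env["cross_traffic"],
    lambda env: not env["oncoming_traffic"],
    lambda env: not env["pedestrian_on_exit"],
]
env = {"pedestrian_on_crosswalk": False, "cross_traffic": False,
       "oncoming_traffic": True, "pedestrian_on_exit": False}

print(drive_scenario(targets, criteria, env))  # → [(0, 10), (4, 14)]
```

With oncoming traffic present, the vehicle advances through the first two elementary situations and then waits before the third target point, exactly as in the left-turn sequence of FIGS. 5a and 5b.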
  • Fig. 7a shows a schematic flow chart to explain the maneuver "turning" at an intersection, the maneuver being based on elementary situations. The flowchart covers several scenarios, which are dynamically adjusted depending on the environment. Parts of the flowchart correspond in each case to the elementary situations which have been described in FIGS. 5a and 5b. For a better overview, the elementary situations are presented as a legend in Fig. 7b, using the same designations as in FIGS. 5a and 5b. Furthermore, the target points defined in FIGS. 6a and 6b are used.
  • In the initial situation 200, the motor vehicle approaches an intersection. It changes lane 201 to a suitable lane in order to prepare the corresponding left-turn or right-turn maneuver. Once it has changed lane, it continues to approach the intersection. In a first query 203, the scenario interpretation device inquires of the environment perception device whether a traffic light system (traffic light) is present.
  • If this is the case, a query 204 asks which traffic light phase is present. There are three options (red, green, off). If the traffic light is switched to "red", the vehicle must wait. It is then driven automatically to a first target point estimated by the scenario interpretation device (target points 111.1, 111.5 in Fig. 6a and Fig. 6b, respectively) before the intersection 205. Once the first target point is reached, the query 204 of the traffic light phase is repeated. If the traffic light phase is "green", it is checked in a query 206 whether the traffic flow is compatible or only conditionally compatible. If the traffic flow is compatible, the turn (to the left or right) can be completed 207, since the motor vehicle enjoys priority. The scenario is then finished. If, on the other hand, the traffic flow is only conditionally compatible, that is, the motor vehicle has to observe priority rules despite the green traffic light phase, a query 208 first clarifies whether the maneuver to be performed is a left turn or a right turn. If a left turn is intended, a following query 209 checks whether traffic is coming from the opposite direction. If this is the case, the motor vehicle is driven automatically to the second target point estimated by the scenario interpretation device (target point 111.2 in Fig. 6a ) 210. There, the query 209 is repeated. If no traffic comes from the opposite direction, the motor vehicle is driven automatically to the third estimated target point (target point 111.3 in Fig. 6a ) 211. There, a query 212 checks whether weaker road users (pedestrians, cyclists, etc.) are crossing or want to cross the road into which the vehicle intends to turn. If this is the case, the vehicle stops 261 and the query 212 is repeated. If there are no weaker road users, the turn is completed (corresponds to target point 111.4 in Fig. 6a ).
If the maneuver of the motor vehicle is a right turn, the query 208 yields a corresponding result. Then, in the next step 211, the vehicle is driven to the sixth target point (target point 111.6 in Fig. 6b ). There, the query 212 is executed accordingly. If there are no weaker road users, the turn is completed 207 (corresponds to target point 111.7 in Fig. 6b ).
  • If the query 204 shows that the traffic light is switched off, this corresponds to the case in which the query 203 shows that no traffic light is present. It is then checked in a query 213 whether a crosswalk is in front of the motor vehicle. If a crosswalk exists, the motor vehicle is driven automatically 214 to the first estimated target point. Here the motor vehicle waits if necessary, and it is checked in a query 215 whether weaker road users are crossing the crosswalk or want to cross it. If so, the vehicle waits 216 and the query 215 is repeated. If there are no weaker road users (anymore), or if there is no crosswalk, a query 217 checks whether there are traffic signs which regulate the right of way.
  • If query 217 indicates that there are no traffic signs, the "right-before-left" rule is assumed. The estimation of the first target point is then repeated by the scenario interpretation device, so that the first target point is shifted further forward 218. Next, a query 219 determines whether the vehicle is to turn left or right. If it is to turn left, it is checked in a query 220 whether other road users are coming from the right. If this is the case, the vehicle stops 221 and the query 220 is repeated. If there are no other road users (anymore) from the right, the vehicle is driven automatically to the second target point (target point 111.2 in Fig. 6a ) 222. There, it is checked in a query 223 whether traffic is coming from the opposite direction. If this is the case, the vehicle stops 260 and the query 223 is repeated. If there is no traffic from the opposite direction (anymore), the motor vehicle is driven automatically to the third estimated target point (target point 111.3 in Fig. 6a ) 224. There, it is checked in a query 225 whether weaker road users are crossing or want to cross the road into which the vehicle intends to turn. If weaker road users are present, the vehicle stops 258 and the query 225 is repeated. If there are no weaker road users (anymore), the motor vehicle is driven automatically to the fourth estimated target point (target point 111.4 in Fig. 6a ) 259.
  • If the query 219 results in a right turn, the scenario is simplified. The vehicle is then driven to the sixth estimated target point (target point 111.6 in Fig. 6b ) 226. There, it is checked in a query 227 whether weaker road users are crossing or want to cross the road into which the vehicle intends to turn. If weaker road users are present, the vehicle stops 228 and the query 227 is repeated. If there are no weaker road users (anymore), the vehicle is driven automatically to the seventh estimated target point (target point 111.7 in Fig. 6b ) 229.
  • If the query 217, however, indicates that traffic signs are present, it is determined in a subsequent query 230 whether the motor vehicle has priority or not. If it does not enjoy priority, the first target point is re-estimated by the scenario interpretation device and thereby shifted (e.g. further back toward the intersection) 231. Subsequently, a query 232 determines whether the vehicle is to turn left or right. If it is to turn left, a subsequent query 233 determines whether there is traffic from the left or right. If this is the case, the vehicle stops 234 and query 233 is repeated. If this is not the case, it is checked in a query 235 whether traffic from the opposite direction is present. If oncoming traffic is present, the motor vehicle is driven automatically to the second estimated target point (target point 111.2 in Fig. 6a ) 236 and query 235 is repeated. If there is no oncoming traffic (anymore), the vehicle is driven to the third estimated target point (target point 111.3 in Fig. 6a ) 237. In a query 238 it is then checked whether there are weaker road users who are crossing or want to cross the road into which the vehicle intends to turn. If weaker road users are present, the vehicle stops 239 and the query 238 is repeated. If there are no weaker road users (anymore), the vehicle is driven 240 to the estimated fourth target point (target point 111.4 in Fig. 6a ).
  • If the query 232, however, indicates a right turn, it is checked in a query 241 whether traffic from the left or right exists. If there is cross traffic, the vehicle stops 242 and the query 241 is repeated. If there is no cross traffic (anymore), the vehicle is driven 243 to the estimated sixth target point (target point 111.6 in Fig. 6b ). There, it is checked in a query 244 whether weaker road users are present who are crossing or want to cross the road into which the vehicle intends to turn. If weaker road users are present, the vehicle stops 245 and the query 244 is repeated. If there are no weaker road users (anymore), the vehicle is driven 246 to the estimated seventh target point (target point 111.7 in Fig. 6b ).
  • If the query 230 indicates that the motor vehicle has priority, it is determined in a following query 247 whether it is to turn left or right. If it is to turn left, it is then checked in a query 248 whether traffic from the opposite direction is present. If oncoming traffic is present, the motor vehicle is driven automatically to the second estimated target point (target point 111.2 in Fig. 6a ) 249 and the query 248 is repeated. If there is no oncoming traffic (anymore), the motor vehicle is driven automatically to the third estimated target point (target point 111.3 in Fig. 6a ) 250. There, it is checked in a query 251 whether there are weaker road users who are crossing or want to cross the road into which the vehicle intends to turn. If weaker road users are present, the vehicle stops 252 and the query 251 is repeated. If there are no weaker road users (anymore), the vehicle is driven 253 to the estimated fourth target point (target point 111.4 in Fig. 6a ).
  • If query 247 indicates a right turn, the motor vehicle is driven automatically to the sixth estimated target point (target point 111.6 in Fig. 6b ) 254. There, it is checked in a query 255 whether weaker road users are present who are crossing or want to cross the road into which the vehicle intends to turn. If weaker road users are present, the vehicle stops 256 and the query 255 is repeated. If there are no weaker road users (anymore), the vehicle is driven 257 to the estimated seventh target point (target point 111.7 in Fig. 6b ).
  • Upon reaching the respective last target point (target point 111.4 in Fig. 6a or target point 111.7 in Fig. 6b ), the respective scenario has been run through completely and is thus ended.
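The first decision layer of the flowchart in Fig. 7a can be condensed into a single dispatch function. This is a deliberately coarse sketch: the branch labels and the flat environment dictionary are illustrative assumptions, and each returned branch stands for a whole sub-chain of queries and target points described above.

```python
# Condensed top-level dispatch of Fig. 7a: traffic light phase first
# (queries 203/204), then crosswalk (213), then signage (217/230).
def select_branch(env):
    if env.get("traffic_light") == "red":
        return "wait at first target point"        # step 205
    if env.get("traffic_light") == "green":
        return ("turn immediately" if env.get("compatible")
                else "observe priority rules")      # query 206 -> 207 / 208
    # traffic light off or absent:
    if env.get("crosswalk"):
        return "wait for weaker road users"         # steps 214-216
    if env.get("priority_signs"):
        return ("priority branch" if env.get("has_priority")
                else "yield branch")                # query 230
    return "right-before-left branch"               # step 218 ff.

print(select_branch({"traffic_light": "off", "crosswalk": False,
                     "priority_signs": True, "has_priority": False}))
# → yield branch
```

Each branch would then be executed as its own sequence of elementary situations, with the stop-and-repeat query loops of the flowchart.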
  • Fig. 8 shows a schematic representation of a device 1 for the automated driving of a motor vehicle 50. The device 1 comprises an environment perception device 2, a scenario interpretation device 3, a maneuver planning device 4 and a controller 5.
  • An environment of the motor vehicle 50 is detected by means of a sensor 6 and provided as environment data 7. Such a sensor 6 may be, for example, a camera, a radar sensor, an ultrasonic sensor or the like. Furthermore, environment data 7 can also be provided via a communication interface 8, for example a Car2X interface, and/or an environment map 9. The environment data 7 are evaluated by the environment perception device 2, and objects in the environment of the motor vehicle are detected. This can be done, for example, via a roadway recognition device 10 and/or an object recognition device 11. In the roadway recognition device 10, a fusion of the environment data 7 relating to the roadway takes place. Correspondingly, in the object recognition device 11, a fusion of the environment data 7 describing objects in the environment of the motor vehicle 50 takes place. Recognized objects can be, for example, other motor vehicles, weaker road users (pedestrians, cyclists, etc.) or traffic signs. The data on the detected objects are forwarded by the environment perception device 2 to the scenario interpretation device 3. The latter evaluates the data and, based on the evaluated data, selects a scenario 12 from a set of provided scenarios. For example, a scenario may be a turn to the left or right (such as shown in FIGS. 5a, 5b or Fig. 7). The selected scenario 12 forms a hypothesis about the current traffic situation. It is always possible for the scenario interpretation device 3 to discard the current scenario 12 and select another or a modified scenario if such a scenario is more probable based on new data from the environment perception device 2. For example, when approaching an intersection, a scenario for a T-junction may first be assumed, but this scenario may then be replaced by one for an X-junction because new (better) data have in the meantime been provided by the environment perception device 2.
  • A current scenario 12 includes at least one elementary situation 13, which corresponds to an intermediate time step in the current scenario 12. The elementary situations 13 are linked with each other in time. A transition to the next elementary situation 13 takes place if at least one release criterion associated with the current elementary situation 13 is fulfilled. For checking the at least one release criterion, each elementary situation 13 is assigned a monitoring area as a further attribute. The monitoring area is queried by the scenario interpretation device 3 by means of a request 20 to the environment perception device 2. The attributes for the current elementary situation 13 are transmitted to the environment perception device 2, which in turn can recognize the elementary situations 13 via a situation recognition device 19. By assigning special attributes, such as certain objects (traffic lights, traffic signs, ...), specific objects can be queried selectively at the environment perception device.
  • For each of the elementary situations 13, the scenario interpretation device 3 then estimates a target point. The target point is transferred to the maneuver planning device 4. The maneuver planning device 4 comprises a mission planner 14, which carries out the planning of the route (for example, which roads are to be traveled on the automated journey from the starting point to the destination) of the motor vehicle. Furthermore, the maneuver planning device 4 also takes over the planning of the concrete trajectories between the individual target points estimated by the scenario interpretation device 3, by means of a trajectory planning device 15. The concretely planned trajectories are transferred from the maneuver planning device 4 to the controller 5. The controller 5 comprises, for example, a control device 16 which regulates or controls at least one actuator 17 of the motor vehicle 50, so that the motor vehicle 50 automatically travels along the concrete trajectory.
  • Optionally, the device 1 may comprise a prediction device 22 which predicts future trajectories of detected objects within the at least one monitoring area of an elementary situation. In this case, for example, map data and traffic regulations, as well as movement data of the detected objects, e.g. a position, an orientation or the use of a turn indicator, can also be taken into account. To estimate the trajectories, for example, the following approaches can be used: Gaussian mixtures, Kalman filters, Monte Carlo simulations, Gaussian processes, logistic regression, relevance vector machines, support vector machines, hidden Markov models and Bayesian networks.
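As the simplest stand-in for the estimation approaches listed above, a constant-velocity prediction can check whether a detected object will enter a monitoring area within the prediction horizon. This is a sketch under assumed names and units (meters, seconds), not one of the named statistical methods; a real prediction device would additionally fuse map data and indicator state as described.

```python
# Constant-velocity propagation of an object's position over a fixed horizon.
def predict_positions(pos, vel, dt, steps):
    """Propagate (x, y) with constant velocity (vx, vy) over `steps` steps
    of length `dt` seconds; returns the list of future positions."""
    x, y = pos
    vx, vy = vel
    return [(x + vx * dt * k, y + vy * dt * k) for k in range(1, steps + 1)]

def crosses_area(trajectory, area):
    """True if any predicted position lies inside the rectangular
    monitoring area (x_min, y_min, x_max, y_max)."""
    x0, y0, x1, y1 = area
    return any(x0 <= x <= x1 and y0 <= y <= y1 for x, y in trajectory)

# A pedestrian 6 m left of the crosswalk area, walking right at 1.5 m/s.
traj = predict_positions(pos=(-6.0, 4.0), vel=(1.5, 0.0), dt=1.0, steps=6)
monitoring_area = (0.0, 0.0, 10.0, 8.0)

print(crosses_area(traj, monitoring_area))  # → True
```

The scenario interpretation device could then treat a predicted entry like a present object when evaluating the release criterion of the current elementary situation.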
  • The device or parts thereof can also be designed as a combination of hardware and software, for example in the form of a microcontroller or microprocessor on which a corresponding computer program is executed.
  • A schematic sequence of elementary situations in a left-turn scenario, as executed by the scenario interpretation device, is shown in FIGS. 9a to 9d.
  • In Fig. 9a, a mission planner provides information about suitable lanes, and the motor vehicle 50 makes a lane change to an appropriate lane. The motor vehicle 50 then approaches the intersection 61. The scenario interpretation device recognizes a left-turn maneuver based on the data from the environment perception device and selects the corresponding scenario. Subsequently, the scenario interpretation device inquires of the environment perception device whether a traffic light exists. If there is no traffic light, the scenario interpretation device additionally needs information about the applicable right-of-way regulation (right-before-left rule, traffic signs, etc.). In this example, the environment perception device recognizes a STOP sign 21, so that a scenario with a corresponding basic sequence of elementary situations is selected (see also the procedure in Fig. 7). The subsequent elementary situations each represent hypotheses about the future course of the scenario. The hypotheses can then be confirmed or refuted by the environment perception device.
  • In the first elementary situation ( Fig. 9a ), the information as to whether weaker road users (pedestrians or cyclists) are present in the monitoring area 110.1 assigned to this elementary situation is requested from the environment perception device. Depending on whether weaker road users are present or not, the scenario interpretation device estimates a first target point 111.1. Since the scenario interpretation device already knows in advance where the monitoring areas 110.2 of the next elementary situation ( Fig. 9b ) lie, information about vehicles coming from the left and the right is also requested from the environment perception device.
  • The scenario interpretation device does not change the target point 111.1 as long as pedestrians or cyclists are still crossing or want to cross the street. Otherwise, the target point 111.1 is moved forward (to just before a potential conflict point with a motor vehicle coming from the left). Furthermore, information about the monitoring area 110.2 is queried from the environment perception device.
  • In the next elementary situation ( Fig. 9c ), oncoming motor vehicles are taken into account. Accordingly, the monitoring area 110.3 is selected for this elementary situation. If there is no cross traffic, the next target point 111.2 is estimated by the scenario interpretation device. The estimated target point 111.2 lies before a possible point of conflict with an oncoming motor vehicle. The motor vehicle 50 is then automatically driven to the target point 111.2.
  • If there is no oncoming traffic, the next elementary situation ( Fig. 9d ) follows. For this purpose, a target point 111.3 is estimated, which is approached automatically by the motor vehicle 50. In this elementary situation, it is necessary to wait for weaker road users who are crossing or want to cross the road into which the vehicle intends to turn. Accordingly, the monitoring area 110.4 is defined. If there are no weaker road users (anymore), the scenario is ended by completing the turn.
  • LIST OF REFERENCE NUMBERS
  • 1
    device
    2
    environment perception device
    3
    scenario interpretation device
    4
    maneuver planning device
    5
    controller
    6
    sensor
    7
    environment data
    8
    communication interface
    9
    environment map
    10
    roadway recognition device
    11
    object recognition device
    12
    scenario
    13
    elementary situation
    14
    mission planner
    15
    trajectory planning device
    16
    control device
    17
    actuator
    21
    stop sign
    22
    prediction device
    50
    motor vehicle
    51
    another motor vehicle
    52
    another motor vehicle
    53
    another motor vehicle
    60
    traffic situation
    61
    intersection
    62.1
    road
    62.2
    road
    63
    lane
    64.1
    crosswalk
    64.2
    crosswalk
    65
    "give way" sign
    66
    priority road sign
    71
    point of conflict
    72
    point of conflict
    73
    point of conflict
    74
    point of conflict
    75
    point of conflict
    81
    pedestrian
    82
    pedestrian
    83
    pedestrian
    90
    trajectory
    91
    trajectory
    92
    trajectory
    93
    trajectory
    100
    scenario
    101
    elementary situation
    101.0
    elementary situation
    101.1
    elementary situation
    101.2
    elementary situation
    101.3
    elementary situation
    101.4
    elementary situation
    101.5
    elementary situation
    101.6
    elementary situation
    101.7
    elementary situation
    101.X
    elementary situation
    102
    scene
    103
    Condition of the motor vehicle
    104
    scenery
    105
    dynamic elements
    106
    Action of the driver
    107
    automated action
    108
    attribute
    109
    hypothesis
    110
    monitoring area
    110.1
    monitoring area
    110.2
    monitoring area
    110.3
    monitoring area
    110.4
    monitoring area
    110.5
    monitoring area
    110.6
    monitoring area
    110.7
    monitoring area
    111.1
    Endpoint
    111.2
    Endpoint
    111.3
    Endpoint
    111.4
    Endpoint
    111.5
    Endpoint
    111.6
    Endpoint
    111.7
    Endpoint
    112.1
    trajectory
    112.2
    trajectory
    112.3
    trajectory
    112.4
    trajectory
    112.5
    trajectory
    112.6
    trajectory
    112.7
    trajectory
    113
    object
    114
    release criterion
    115
    correction
    200-261
    Queries and actions in a scenario

Claims (10)

  1. Method for automatically driving a motor vehicle (50), comprising the following steps:
    Provision of environment data (7),
    Detecting objects (113) in the environment data (7) by an environment perception device (2),
    Selecting a current scenario (12, 100) from a provided set of known scenarios (12, 100) by a scenario interpretation device (3) on the basis of the detected objects (113), wherein a scenario (12, 100) comprises at least one elementary situation (13, 101, 101.1, 101.2, 101.3, 101.4, 101.5, 101.6, 101.7, 101.X), and wherein each elementary situation (13, 101, 101.1, 101.2, 101.3, 101.4, 101.5, 101.6, 101.7, 101.X) is assigned, as attributes (108), at least one monitoring area (110.1, 110.2, 110.3, 110.4, 110.5, 110.6, 110.7) and one release criterion (114),
    and, in succession, for each of the elementary situations (13, 101, 101.1, 101.2, 101.3, 101.4, 101.5, 101.6, 101.7, 101.X) of the current scenario (12, 100):
    Estimating a target point (111.1, 111.2, 111.3, 111.4, 111.5, 111.6, 111.7) for the currently existing elementary situation (13, 101, 101.1, 101.2, 101.3, 101.4, 101.5, 101.6, 101.7, 101.X) by the scenario interpretation device (3), planning a trajectory (112.1, 112.2, 112.3, 112.4, 112.5, 112.6, 112.7) to the target point (111.1, 111.2, 111.3, 111.4, 111.5, 111.6, 111.7) by a maneuver planning device (4),
    Retrieving information about the monitoring area (110.1, 110.2, 110.3, 110.4, 110.5, 110.6, 110.7) from the environment perception device (2) by the scenario interpretation device (3), checking, on the basis of the retrieved information, whether the at least one release criterion (114) of the currently existing elementary situation (13, 101, 101.1, 101.2, 101.3, 101.4, 101.5, 101.6, 101.7, 101.X) is fulfilled, by the scenario interpretation device (3), and
    if the at least one release criterion (114) is fulfilled,
    automated driving along the trajectory (112.1, 112.2, 112.3, 112.4, 112.5, 112.6, 112.7) to the target point (111.1, 111.2, 111.3, 111.4, 111.5, 111.6, 111.7) by controlling at least one actuator (17) of the motor vehicle (50) by a controller (5).
  2. Method according to one of the preceding claims, characterized in that the scenario interpretation device (3) continuously checks whether the selected current scenario (12, 100) is still present or whether a change of the current scenario (12, 100) is necessary.
  3. Method according to one of the preceding claims, characterized in that at least one elementary situation (13, 101, 101.1, 101.2, 101.3, 101.4, 101.5, 101.6, 101.7, 101.X) has a further attribute (108), wherein the further attribute (108) is a specific object (113) whose presence is queried by the scenario interpretation device (3) at the environment perception device (2).
  4. Method according to one of the preceding claims, characterized in that the estimation of the target point (111.1, 111.2, 111.3, 111.4, 111.5, 111.6, 111.7) is dynamically adapted by the scenario interpretation device (3) on the basis of the information requested from the environment perception device (2).
  5. Device (1) for the automated driving of a motor vehicle (50), comprising:
    an environment perception device (2) for recognizing objects (113) in provided environment data (7),
    a scenario interpretation device (3) for selecting a current scenario (12, 100) from a provided set of known scenarios (12, 100) on the basis of the detected objects (113), wherein a scenario (12, 100) comprises at least one elementary situation (13, 101, 101.1, 101.2, 101.3, 101.4, 101.5, 101.6, 101.7, 101.X), and wherein each elementary situation (13, 101, 101.1, 101.2, 101.3, 101.4, 101.5, 101.6, 101.7, 101.X) is assigned, as attributes (108), at least one monitoring area (110.1, 110.2, 110.3, 110.4, 110.5, 110.6, 110.7) and one release criterion (114),
    a maneuver planning device (4) and
    a controller (5),
    wherein the scenario interpretation device (3) is designed such that it estimates a target point (111.1, 111.2, 111.3, 111.4, 111.5, 111.6, 111.7) for a currently existing elementary situation (13, 101, 101.1, 101.2, 101.3, 101.4, 101.5, 101.6, 101.7, 101.X), requests information about the monitoring area (110.1, 110.2, 110.3, 110.4, 110.5, 110.6, 110.7) of the elementary situation (13, 101, 101.1, 101.2, 101.3, 101.4, 101.5, 101.6, 101.7, 101.X) from the environment perception device (2), and checks, on the basis of the requested information, whether the at least one release criterion (114) of the current elementary situation (13, 101, 101.1, 101.2, 101.3, 101.4, 101.5, 101.6, 101.7, 101.X) is fulfilled,
    and wherein the controller (5) is designed such that, provided the at least one release criterion (114) is fulfilled, it drives the motor vehicle (50) along a trajectory (112.1, 112.2, 112.3, 112.4, 112.5, 112.6, 112.7) planned by the maneuver planning device (4) to the target point (111.1, 111.2, 111.3, 111.4, 111.5, 111.6, 111.7) by controlling at least one actuator (17) of the motor vehicle (50),
    and wherein the scenario interpretation device (3) is further designed such that, after the target point (111.1, 111.2, 111.3, 111.4, 111.5, 111.6, 111.7) has been reached, it selects a next elementary situation (13, 101, 101.1, 101.2, 101.3, 101.4, 101.5, 101.6, 101.7, 101.X) of the current scenario (12, 100).
  6. Device (1) according to claim 5, characterized in that the device (1) comprises at least one sensor (6) which detects the environment of the motor vehicle (50) and provides the environment data (7).
  7. Device (1) according to claim 5 or 6, characterized in that the device (1) has a memory and the environment data (7) are provided alternatively or additionally in the form of a stored environment map (9).
  8. Device (1) according to one of claims 5 to 7, characterized in that the device (1) has a communication interface (8) and the environment data (7) are provided alternatively or additionally via the communication interface (8).
  9. Device (1) according to one of claims 5 to 8, characterized in that the device (1) comprises a prediction device (22), wherein the prediction device (22) is designed to predict future trajectories of recognized objects (113) within the at least one monitoring area (110.1, 110.2, 110.3, 110.4, 110.5, 110.6, 110.7) of an elementary situation (13, 101, 101.1, 101.2, 101.3, 101.4, 101.5, 101.6, 101.7, 101.X).
  10. Device (1) according to claim 9, characterized in that a release criterion (114) is fulfilled when a trajectory (112.1, 112.2, 112.3, 112.4, 112.5, 112.6, 112.7) of the motor vehicle (50) to the next target point (111.1, 111.2, 111.3, 111.4, 111.5, 111.6, 111.7) is collision-free with the trajectories of other objects (113).
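The collision-freedom check of claim 10 can be illustrated with a minimal sketch: sample the ego trajectory and each predicted object trajectory at the same time steps and require a safety margin at every step. This is not part of the claims; the function name, the margin value, and the sampled-trajectory representation are all assumptions made for illustration:

```python
import math

def collision_free(ego_trajectory, object_trajectories, safety_margin=1.5):
    """Return True if the ego trajectory keeps at least `safety_margin`
    (meters, assumed) to every predicted object trajectory at each
    common time step. Trajectories are lists of (x, y) positions
    sampled at the same instants."""
    for obj_trajectory in object_trajectories:
        for (ex, ey), (ox, oy) in zip(ego_trajectory, obj_trajectory):
            if math.hypot(ex - ox, ey - oy) < safety_margin:
                return False  # predicted positions come too close: no release
    return True  # release criterion fulfilled: drive to the next target point
```

For instance, an object predicted to reach the same position as the ego vehicle at the same time step fails the check, while an object that stays well away passes it.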
EP16197937.2A 2015-12-04 2016-11-09 Method and device in a motor vehicle for automated driving Pending EP3176046A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE102015224338.9A DE102015224338A1 (en) 2015-12-04 2015-12-04 Method and device in a motor vehicle for automated driving

Publications (1)

Publication Number Publication Date
EP3176046A1 true EP3176046A1 (en) 2017-06-07

Family

ID=57281092

Family Applications (1)

Application Number Title Priority Date Filing Date
EP16197937.2A Pending EP3176046A1 (en) 2015-12-04 2016-11-09 Method and device in a motor vehicle for automated driving

Country Status (4)

Country Link
US (1) US10173675B2 (en)
EP (1) EP3176046A1 (en)
CN (1) CN106871915A (en)
DE (1) DE102015224338A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019002160A1 (en) * 2017-06-30 2019-01-03 Siemens Aktiengesellschaft Context-based autonomous control of a vehicle
DE102019205481A1 (en) * 2019-04-16 2020-10-22 Zf Friedrichshafen Ag Environment detection by means of a sensor with a variable detection area

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107179767A (en) * 2016-03-10 2017-09-19 松下电器(美国)知识产权公司 Steering control device, driving control method and non-transient recording medium
US10202118B2 (en) * 2016-10-14 2019-02-12 Waymo Llc Planning stopping locations for autonomous vehicles
US10490066B2 (en) * 2016-12-29 2019-11-26 X Development Llc Dynamic traffic control
KR20180095240A (en) * 2017-02-17 2018-08-27 현대자동차주식회사 Apparatus for controlling competition of an autonomous vehicle, system having the same and method thereof
DE102017006338A1 (en) * 2017-06-30 2019-01-03 Daniel Karch Method for efficient validation and secure application of autonomous and semi-autonomous vehicles
DE102017213353A1 (en) * 2017-08-02 2019-02-07 Bayerische Motoren Werke Aktiengesellschaft Verification of a planned trajectory of an automated motor vehicle
DE102017118651A1 (en) * 2017-08-16 2019-02-21 Valeo Schalter Und Sensoren Gmbh Method and system for collision avoidance of a vehicle
DE102018202966A1 (en) * 2018-02-28 2019-08-29 Robert Bosch Gmbh Method for operating at least one automated vehicle
US10782699B2 (en) * 2018-03-10 2020-09-22 Baidu Usa Llc Real-time perception adjustment and driving adaption based on surrounding vehicles' behavior for autonomous driving vehicles
US20190302768A1 (en) * 2018-04-03 2019-10-03 Baidu Usa Llc Perception and planning collaboration framework for autonomous driving
DE102018208105B3 (en) * 2018-05-23 2019-03-21 Volkswagen Aktiengesellschaft A method for supporting a guidance of at least one motor vehicle and assistance system
DE102018209031A1 (en) * 2018-06-07 2019-12-12 Robert Bosch Gmbh Method and apparatus for operating an automated vehicle at an intersection
DE102018209978A1 (en) * 2018-06-20 2019-12-24 Robert Bosch Gmbh Automatic crossing of an intersection area
CN109060370B (en) * 2018-06-29 2019-12-10 奇瑞汽车股份有限公司 Method and device for vehicle testing of automatically driven vehicle
DE102018222492A1 (en) * 2018-12-20 2020-06-25 Robert Bosch Gmbh Method and device for operating an automated vehicle
CN109774716A (en) * 2019-01-16 2019-05-21 北京百度网讯科技有限公司 Control method for vehicle and device
DE102019204260A1 (en) * 2019-03-27 2020-10-01 Zf Friedrichshafen Ag Controlling a motor vehicle

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005020429A1 (en) * 2005-04-29 2006-11-09 Daimlerchrysler Ag Method and device to support driver when crossing an intersection divide crossing into many zones arranged on the street and determine the navigability of each on the basis of surrounding information
DE102009008745A1 (en) 2009-02-12 2010-08-19 Volkswagen Ag Method for automatic traffic routing of motor vehicle, involves transmitting traffic routing data from surrounding field model to corresponding road user for traffic routing by central control unit
DE102010010856A1 (en) 2010-03-10 2011-09-15 Continental Teves Ag & Co. Ohg Method for automatic supporting of driver of motor vehicle with its driving situation, involves detecting vehicle environment and producing electronic image, where electronic image is applied for recognition of lane or track
DE102010018333A1 (en) 2010-04-27 2011-10-27 Gm Global Technology Operations Llc (N.D.Ges.D. Staates Delaware) Method for evaluating information of image that is captured by optical camera in driver assisting system in e.g. passenger car, involves evaluating information that contains in determined image region, which comprises track course portion
US8761991B1 (en) * 2012-04-09 2014-06-24 Google Inc. Use of uncertainty regarding observations of traffic intersections to modify behavior of a vehicle
WO2015008588A1 (en) * 2013-07-19 2015-01-22 日産自動車株式会社 Drive assist device for vehicle, and drive assist method for vehicle

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102059978B (en) * 2009-11-16 2014-09-10 财团法人工业技术研究院 Assisted method and system for driving
US8744626B2 (en) * 2010-05-27 2014-06-03 Deere & Company Managing autonomous machines across multiple areas
DE102012002307A1 (en) * 2012-02-06 2013-08-08 Audi Ag Motor vehicle with a driver assistance device and method for operating a motor vehicle
US9253753B2 (en) * 2012-04-24 2016-02-02 Zetta Research And Development Llc-Forc Series Vehicle-to-vehicle safety transceiver using time slots
FR2996512B1 (en) * 2012-10-05 2014-11-21 Renault Sa Method for evaluating the risk of collision at an intersection
DE102013216263A1 (en) * 2013-08-16 2015-02-19 Continental Automotive Gmbh Arrangement for controlling a highly automated driving of a vehicle
JP6241341B2 (en) * 2014-03-20 2017-12-06 アイシン・エィ・ダブリュ株式会社 Automatic driving support device, automatic driving support method and program
CN103921788B (en) * 2014-04-02 2018-03-16 奇瑞汽车股份有限公司 A kind of running car control system and method
EP2950114B1 (en) * 2014-05-30 2020-03-04 Honda Research Institute Europe GmbH Method for assisting a driver in driving a vehicle, a driver assistance system, a computer software program product and vehicle
CN107438754A (en) * 2015-02-10 2017-12-05 御眼视觉技术有限公司 Sparse map for autonomous vehicle navigation
US9754490B2 (en) * 2015-11-04 2017-09-05 Zoox, Inc. Software application to request and control an autonomous vehicle service

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005020429A1 (en) * 2005-04-29 2006-11-09 Daimlerchrysler Ag Method and device to support driver when crossing an intersection divide crossing into many zones arranged on the street and determine the navigability of each on the basis of surrounding information
DE102009008745A1 (en) 2009-02-12 2010-08-19 Volkswagen Ag Method for automatic traffic routing of motor vehicle, involves transmitting traffic routing data from surrounding field model to corresponding road user for traffic routing by central control unit
DE102010010856A1 (en) 2010-03-10 2011-09-15 Continental Teves Ag & Co. Ohg Method for automatic supporting of driver of motor vehicle with its driving situation, involves detecting vehicle environment and producing electronic image, where electronic image is applied for recognition of lane or track
DE102010018333A1 (en) 2010-04-27 2011-10-27 Gm Global Technology Operations Llc (N.D.Ges.D. Staates Delaware) Method for evaluating information of image that is captured by optical camera in driver assisting system in e.g. passenger car, involves evaluating information that contains in determined image region, which comprises track course portion
US8761991B1 (en) * 2012-04-09 2014-06-24 Google Inc. Use of uncertainty regarding observations of traffic intersections to modify behavior of a vehicle
WO2015008588A1 (en) * 2013-07-19 2015-01-22 日産自動車株式会社 Drive assist device for vehicle, and drive assist method for vehicle
EP3023963A1 (en) * 2013-07-19 2016-05-25 Nissan Motor Co., Ltd Drive assist device for vehicle, and drive assist method for vehicle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JAVIER ALONSO ET AL: "Autonomous vehicle control systems for safe crossroads", TRANSPORTATION RESEARCH. PART C, EMERGING TECHNOLOGIES, PERGAMON, NEW YORK, NY, GB, vol. 19, no. 6, 5 June 2011 (2011-06-05), pages 1095 - 1110, XP028289470, ISSN: 0968-090X, [retrieved on 20110630], DOI: 10.1016/J.TRC.2011.06.002 *


Also Published As

Publication number Publication date
US20170158193A1 (en) 2017-06-08
US10173675B2 (en) 2019-01-08
CN106871915A (en) 2017-06-20
DE102015224338A1 (en) 2017-06-08

Similar Documents

Publication Publication Date Title
US10514705B2 (en) Constraint augmentation in a navigational system
US10407059B2 (en) Automated vehicle parking
US10303166B2 (en) Supervisory control of vehicles
US10303173B2 (en) Facilitating rider pick-up for a self-driving vehicle
US10379533B2 (en) System and method for autonomous vehicle fleet routing
US10235885B2 (en) Autonomous vehicle driving system and method
US9969326B2 (en) Intention signaling for an autonomous vehicle
JP2017165407A (en) System and method for parking vehicle near obstacles
US9990548B2 (en) Traffic signal analysis system
RU2726238C2 (en) Self-contained vehicle with direction support
KR20190107071A (en) Rule based navigation
DE102016220945A1 (en) Method and device for supporting a maneuvering process of a motor vehicle
Ardelt et al. Highly automated driving on freeways in real traffic using a probabilistic framework
US20170329332A1 (en) Control system to adjust operation of an autonomous vehicle based on a probability of interference by a dynamic object
US9442489B2 (en) Method for coordinating the operation of motor vehicles that drive in fully automated mode
US10037553B2 (en) Selecting vehicle type for providing transport
DE102016211183A1 (en) Method, device and system for carrying out an automated journey of a vehicle with the participation of at least one further vehicle
US20190056742A1 (en) Autonomous vehicle operated with safety augmentation
KR101703144B1 (en) Apparatus and method for autonomous driving
US9933779B2 (en) Autonomous vehicle operated with guide assistance of human driven vehicles
DE102015002405A1 (en) Method for traffic coordination of motor vehicles in a parking environment
WO2015156146A1 (en) Travel control device, onboard display device, and travel control system
CA3029742A1 (en) Autonomous vehicle control using submaps
CN107074282B (en) Method and apparatus for running vehicle
US10317899B2 (en) Intervention in operation of a vehicle having autonomous driving capabilities

Legal Events

Date Code Title Description
AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AV Request for validation of the european patent

Extension state: MA MD

AX Request for extension of the european patent to:

Extension state: BA ME

17P Request for examination filed

Effective date: 20171207

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR