EP3928127A1 - Infrastructure-side environment detection in autonomous driving - Google Patents

Infrastructure-side environment detection in autonomous driving

Info

Publication number
EP3928127A1
EP3928127A1 (application EP20725093.7A)
Authority
EP
European Patent Office
Prior art keywords
vehicle
environment model
data
sensor data
sensors
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20725093.7A
Other languages
German (de)
English (en)
Inventor
Dominic BERGES
Dominik Zoeke
Marcus Zwick
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Siemens Mobility GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Mobility GmbH filed Critical Siemens Mobility GmbH
Publication of EP3928127A1
Legal status: Pending

Classifications

    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S13/42 Simultaneous measurement of distance and other co-ordinates
    • G01S13/91 Radar or analogous systems specially adapted for traffic control
    • G01S17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S7/003 Transmission of data between radar, sonar or lidar systems and remote stations
    • G01S7/4804 Auxiliary means for detecting or identifying lidar signals or the like, e.g. laser illuminators
    • G01S7/4808 Evaluating distance, position or velocity data
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/54 Surveillance or monitoring of activities of traffic, e.g. cars on the road, trains or boats
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G08G1/0116 Measuring and analyzing of parameters relative to traffic conditions based on data from roadside infrastructure, e.g. beacons
    • G08G1/0133 Traffic data processing for classifying traffic situation
    • G08G1/0141 Measuring and analyzing of parameters relative to traffic conditions for traffic information dissemination
    • G08G1/096775 Systems involving transmission of highway information, e.g. weather, speed limits, where the origin of the information is a central station
    • G01S2013/9316 Radar or analogous anti-collision systems of land vehicles combined with communication equipment with other vehicles or with base stations

Definitions

  • The invention relates to a method for generating an environment model of an autonomously controlled vehicle.
  • The invention also relates to a method for the autonomous control of a vehicle.
  • The invention also relates to an environment model generation device.
  • The invention also relates to a vehicle control device.
  • The invention also relates to an autonomously controlled vehicle.
  • The invention also relates to an automated transport system.
  • Autonomous driving is understood to be the independent driving, control and parking of a vehicle without human influence.
  • Autonomous driving requires precise knowledge of the position and speed both of the autonomously controlled vehicle itself and of objects located in the vicinity of its route. In the context of autonomous vehicles, the perception of the surroundings and the localization and detection of objects, as well as of the autonomous vehicle itself, are therefore of outstanding importance. If the surroundings are detected only by the autonomous vehicle itself, this detection is limited to the detection range of the vehicle's own sensors.
  • The sensors used in vehicles, such as lidar, radar, ultrasonic sensors and cameras, have limitations in terms of their range, real-time capability, coverage of the surrounding area and performance, especially in difficult weather conditions such as strong sunlight, rain, snow or fog.
  • In addition, shadowed areas caused by trucks or cars can occur during vehicle-side environment recognition; objects in such occluded areas cannot be perceived by the vehicle's own environment detection.
  • Majority decisions are commonly used when detecting objects, i.e. an object is accepted as detected if it is reported by a majority of the available sensors but not by a minority. It would be desirable to increase the number of sensors contributing to such a decision beyond the number of sensors directly available to the autonomous vehicle. Infrastructure-side sensors could then also be consulted for such a decision: if, for example, two sensors of the vehicle detect an object and two infrastructure-side sensors detect it as well, the object would likewise be accepted as detected.
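The majority decision over vehicle-side and infrastructure-side sensors can be sketched as follows; the function name `object_detected` and the strict-majority threshold are illustrative assumptions, not taken from the patent:

```python
def object_detected(vehicle_votes, infra_votes):
    """Accept an object as detected when a majority of ALL contributing
    sensors (vehicle-side plus infrastructure-side) report it.
    Each argument is a list of booleans, one per sensor."""
    votes = list(vehicle_votes) + list(infra_votes)
    return sum(votes) > len(votes) / 2

# Two of four vehicle sensors see the object: no majority on its own.
print(object_detected([True, True, False, False], []))            # False
# Two infrastructure-side sensors that also see it tip the decision.
print(object_detected([True, True, False, False], [True, True]))  # True
```

The infrastructure sensors simply enlarge the voting pool, which is exactly the benefit described above.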
  • Some situations also require a "look around the corner", for example driving through an X-junction or T-junction. Difficult situations, such as entrances to and exits from underpasses or bridges, turning maneuvers, overtaking maneuvers, etc., cannot be handled safely by vehicle-side sensors either.
  • Another problem is that powerful environment detection demands an ever more complex sensor system and increasingly complex environment modeling algorithms. This is associated with immense costs for the individual vehicles. For example, up to 12 lidar devices are currently installed per vehicle, at a unit price between 6,000 and 65,000 euros.
  • the speed of the autonomous vehicle is greatly reduced in order to ensure the safety of the vehicle, or emergency braking occurs, as a result of which occupants in autonomous vehicles are endangered. Attempts should therefore be made to avoid such emergency braking.
  • a safety driver is required, who forms a fallback level of the system.
  • the object is therefore to provide a method and a device for improved environment recognition and localization of autonomous vehicles.
  • This object is achieved by a method for generating an environment model of an autonomously controlled vehicle according to claim 1, a method for autonomous control of a vehicle according to claim 10, an environment model generation device according to claim 11, a vehicle control device according to claim 12, an autonomously controlled vehicle according to claim 13 and an automated transport system according to claim 14.
  • sensor data are recorded by a plurality of infrastructure-side sensors in a surrounding area of the vehicle.
  • the merged sensor data are generated on the basis of the recorded sensor data.
  • Environment model data are generated in a stationary manner by evaluating the sensor data on the infrastructure side, with objects located in the surrounding area being localized and identified. Finally, the environment model data are transmitted to the vehicle.
  • the sensors on the infrastructure side work on the principle of geofencing. If an autonomously controlled vehicle enters a communication area of the infrastructure-side monitoring, communication takes place between the infrastructure and the autonomous vehicle. As part of this communication, the vehicle receives information about the permanently defined surrounding area monitored by the sensors and objects that are present or moving therein. An environment model of the autonomous vehicle can be created or supplemented on the basis of this information. If the autonomous vehicle drives into the detection area of the infrastructure-side sensors, it is detected and this information is integrated into the environment model of the area monitored by the sensors. If the autonomously controlled vehicle leaves the communication area, communication between the infrastructure-side monitoring units and the relevant autonomously controlled vehicle ends. The communication area and the monitoring area do not have to, but can also be identical.
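The geofencing behaviour described above can be sketched roughly as follows; the class `TrafficCell`, its method names and the string return values are hypothetical, chosen only to illustrate the start and end of communication on entering and leaving the communication area:

```python
class TrafficCell:
    """Infrastructure-side geofencing sketch: vehicles entering the
    communication area start receiving environment-model updates;
    leaving it ends the communication."""

    def __init__(self, cell_id):
        self.cell_id = cell_id
        self.subscribed = set()  # vehicles currently in the communication area

    def update_position(self, vehicle_id, in_comm_area):
        if in_comm_area and vehicle_id not in self.subscribed:
            self.subscribed.add(vehicle_id)
            return "start_transmission"   # begin sending environment model data
        if not in_comm_area and vehicle_id in self.subscribed:
            self.subscribed.discard(vehicle_id)
            return "stop_transmission"    # communication ends on exit
        return "no_change"

cell = TrafficCell("junction-40")
print(cell.update_position("F", True))   # start_transmission
print(cell.update_position("F", True))   # no_change
print(cell.update_position("F", False))  # stop_transmission
```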
  • an environment model is generated by using the method according to the invention for generating an environment model of an autonomously controlled vehicle.
  • the environment model can be transmitted statically for a defined so-called traffic cell in which the autonomous vehicle is currently located.
  • the current vehicle position can be used to dynamically track the transmitted environment model section.
  • the environment model generating device has infrastructure-side sensors for acquiring sensor data in the area surrounding the sensors.
  • Part of the environment model generation device according to the invention is also a stationary fusion unit for merging the sensor data.
  • the environment model generation device according to the invention also has an evaluation unit for the stationary generation of environment model data on the basis of the merged sensor data. Objects located in the surrounding area are localized and identified.
  • Part of the environment model generation device according to the invention is also a stationary communication unit for transmitting the environment model data to an autonomously controlled vehicle.
  • The environment model generation device shares the advantages of the method according to the invention for generating an environment model of an autonomously controlled vehicle.
  • The vehicle control device has a vehicle-side communication unit for receiving environment model data from the stationary environment model generation device. Part of the vehicle control device according to the invention is also a control unit for the automated control of the driving behavior of a vehicle on the basis of the environment model generated by the stationary environment model generation device according to the invention.
  • the vehicle control device shares the advantages of the method according to the invention for autonomous control of a vehicle.
  • the autonomously controlled vehicle according to the invention comprises the vehicle control device according to the invention.
  • the autonomously controlled vehicle according to the invention shares the advantages of the vehicle control device according to the invention.
  • The automated transport system according to the invention has an environment model generation device according to the invention arranged on the infrastructure side and at least one autonomously controlled vehicle according to the invention.
  • An object localization can advantageously be supplemented by the infrastructure or at least partially outsourced to it, so that the performance of the object localization and object recognition of an autonomously controlled vehicle is improved.
  • Some components of the environment model generation device according to the invention, the vehicle control device according to the invention, the autonomously controlled vehicle according to the invention and the automated transport system according to the invention can for the most part be designed in the form of software components. This applies in particular to parts of the stationary fusion unit, the evaluation unit and the control unit. In principle, however, these components can also be implemented in part, especially when particularly fast calculations are involved, in the form of software-supported hardware, for example FPGAs or the like. Likewise, the required interfaces, for example when only a transfer of data from other software components is involved, can be designed as software interfaces. However, they can also be configured as hardware-based interfaces that are controlled by suitable software.
  • A largely software-based implementation has the advantage that computer systems already present in a mobile object or in the infrastructure can easily be retrofitted with a software update, possibly supplemented by additional hardware elements such as sensors and communication units, in order to work in the manner according to the invention.
  • The object is also achieved by a corresponding computer program product with a computer program that can be loaded directly into a memory device of such a computer system, with program sections for carrying out the software-implementable steps of the method according to the invention when the computer program is executed in the computer system.
  • Such a computer program product can optionally contain additional components such as documentation and/or additional components, including hardware components such as hardware keys (dongles, etc.) for using the software.
  • A computer-readable medium, for example a memory stick, a hard disk or some other transportable or permanently installed data carrier, on which the program sections of the computer program that can be read in and executed by a computing unit are stored, can be used for transport to the memory device of the computer system and/or for storage on the computer system.
  • the computing unit can, for example, have one or more cooperating microprocessors or the like for this purpose.
  • the sensors include radar sensors and / or lidar sensors.
  • Radar waves preferably have wavelengths of a few millimeters to centimeters. Radar offers complete volume illumination without blind spots but, for a comparable sensor size, achieves a lower selectivity than lidar. Radar, however, is robust in almost all weather conditions, such as rain, snow, fog, darkness or direct sunlight.
  • Lidar has the property of being able to image object edges more precisely than radar. However, lidar has a very limited field of view and is also more susceptible to adverse weather conditions.
  • the two detection technologies can also be advantageously combined in order to combine the various advantages of the different technologies described with one another and to compensate for the disadvantages described.
  • the sensors can also have cameras. With cameras, an extensive area can be recorded and monitored simultaneously with good resolution.
  • a combination of radar and / or lidar with cameras as sensors can also be used advantageously.
  • the environment model data are preferably generated object-based, with attributes being assigned to individual objects.
  • The attributes include information on relevant properties of the detected objects. These properties enable a hazard assessment in connection with a detected object and allow the behavior of a detected object to be predicted to a certain extent.
  • the named attributes of an object can include at least one of the following attribute types:
  • Further attributes can include time stamps, an identification number, the length, width and height of a vehicle or object, contours, parameters close to raw data, such as the number of reflection points or a security level or reliability level of transmitted data.
  • the reliability of the data can depend, for example, on whether they were estimated or otherwise determined.
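The object-based representation with per-object attributes might look like the following in code; all field names are assumptions chosen to mirror the attributes named above (time stamp, identification number, dimensions, raw-data-related parameters, reliability level):

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    """One object-based environment-model entry (illustrative sketch)."""
    object_id: int
    timestamp: float             # time of detection, seconds
    obj_type: str                # e.g. "pedestrian", "cyclist", "vehicle"
    position: tuple              # (x, y) in map coordinates, metres
    speed: float                 # m/s
    length: float = 0.0
    width: float = 0.0
    height: float = 0.0
    reflection_points: int = 0   # parameter close to the raw data
    estimated: bool = False      # True if the values were estimated

    @property
    def reliability(self) -> str:
        # The reliability level depends on whether the data were
        # estimated or otherwise determined (measured).
        return "estimated" if self.estimated else "measured"

obj = DetectedObject(7, 12.5, "pedestrian", (3.0, 4.0), 1.4, estimated=True)
print(obj.obj_type, obj.reliability)  # pedestrian estimated
```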
  • the environment model data include a grid-based occupancy map.
  • A grid-based occupancy map, also referred to as an "occupancy grid map", divides the monitored area into a plurality of cells, whose occupancy probability is determined on the basis of sensor data.
  • An object can be identified on the basis of the position and number of occupied cells in the grid of the grid-based occupancy map, and a detection probability value of an object can be specified on the basis of the occupancy probabilities of those cells.
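Deriving a detection probability value from the occupied cells could be sketched as follows; taking the mean cell probability is one plausible choice among several, and `object_detection_probability` is a hypothetical helper name:

```python
def object_detection_probability(grid, cells):
    """Given a grid of per-cell occupancy probabilities and the cells
    attributed to one object hypothesis, derive a detection probability
    value for the object (here: the mean cell probability)."""
    probs = [grid[r][c] for r, c in cells]
    return sum(probs) / len(probs)

grid = [
    [0.0, 0.1, 0.0],
    [0.0, 0.9, 0.8],
    [0.0, 0.7, 0.0],
]
# Cells (1,1), (1,2), (2,1) form one connected object hypothesis.
p = object_detection_probability(grid, [(1, 1), (1, 2), (2, 1)])
print(round(p, 2))  # 0.8
```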
  • the surrounding area is divided into sub-areas with different priorities.
  • Areas with different priorities can, for example, have different requirements for real-time capabilities of sensor detection, different security requirements and system requirements derived from them.
  • a high priority may require the use of different sensors with different performance in different weather conditions, a high level of redundancy in the detection of objects and the processing of sensor data, high real-time requirements and a high level of accuracy and resolution in the area detection.
  • Preferably, a significant proportion of the processing of the sensor data is carried out in each sensor itself, taking the specific sensor properties into account. No further prior knowledge is then required to generate the occupancy map. This reduces the amount of data collected by all sensors and the computing load for creating the occupancy map. However, this can come at the expense of the accuracy of the result, since information that could be helpful for a possible data fusion when creating the occupancy map can be lost during the sensor-internal processing.
  • the grid-based occupancy map is preferably generated on the basis of the object-based environment model data.
  • In this case, the amount of data for generating the grid-based occupancy map is relatively small, so that the grid-based occupancy map can be created very quickly.
  • Particularly preferably, the grid-based occupancy map is generated directly on the basis of the sensor data.
  • the captured data can be processed particularly precisely, since no intermediate steps can lead to a loss of information.
  • the processing takes place here on the basis of sensor data, which are composed in such a way that no information that is helpful for increasing the accuracy of the data fusion is lost due to any preprocessing.
  • Although the amount of data is not necessarily increased in this case, certain sensor properties or prior knowledge about the acquisition unit of the sensor, such as its resolution, measurement accuracy, etc., must be known and taken into account during the fusion.
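One common way to take per-sensor properties such as measurement accuracy into account during fusion is inverse-variance weighting. The patent does not prescribe a particular fusion rule, so the following is only an illustrative sketch with hypothetical values:

```python
def fuse_measurements(measurements):
    """Inverse-variance weighted fusion of one scalar quantity (e.g. a
    position coordinate) measured by sensors of different accuracy.

    measurements: list of (value, variance) pairs, one per sensor;
    more accurate sensors (smaller variance) get larger weight.
    """
    weights = [1.0 / var for _, var in measurements]
    fused = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    return fused

# Lidar: precise object edges; radar: coarser but weather-robust.
lidar = (10.0, 0.01)   # metres, variance
radar = (10.4, 0.04)
print(round(fuse_measurements([lidar, radar]), 2))  # 10.08
```

The fused value lies closer to the lidar measurement, reflecting its higher accuracy; this is exactly the kind of prior knowledge about each acquisition unit that must be available during fusion.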
  • sensor data recorded on the vehicle side from the area surrounding the vehicle are also used to generate the environment model data.
  • In this way, the database for the environment model of the vehicle can advantageously be expanded, so that the reliability, resolution and completeness of the environment model are improved.
  • FIG. 1 shows a flow diagram which illustrates a method for generating an environment model of an autonomously controlled vehicle according to an exemplary embodiment of the invention
  • FIG. 2 shows a schematic representation of an automated transport system
  • FIG. 3 shows a schematic representation of a stationary monitored straight route with areas with different priorities
  • FIG. 4 shows a schematic representation of a stationary monitored junction with areas with different priorities
  • FIG. 5 shows a schematic representation of a layer model of determined environment data
  • FIG. 6 shows a schematic representation which illustrates the creation of a digital twin of a node or an intersection
  • FIG. 7 shows an environment model generating device according to an exemplary embodiment of the invention
  • FIG. 8 shows a schematic representation of a vehicle control device.
  • FIG. 1 shows a flow diagram 100 which illustrates a method for generating an environment model MD of an autonomously controlled vehicle F according to an exemplary embodiment of the invention.
  • In step 1.I, sensor data SD are recorded by a plurality of infrastructure-side sensors in an area surrounding the vehicle F.
  • the sensors used to acquire the sensor data can include, for example, lidar sensors, radar sensors and cameras.
  • the sensors are arranged on sensor masts at regular intervals along a driveway or around an intersection or junction and each monitor a closer environment.
  • In step 1.II, the sensor data SD from different sensors are merged, so that merged sensor data FSD are generated on the basis of the sensed sensor data.
  • In step 1.III, an environment model MD of a street section, for example an intersection or a straight section, is generated in a stationary manner on the basis of the merged sensor data FSD. In the context of the environment model, objects located in the surrounding area are localized and identified.
  • the environment model data MD are also transmitted to the vehicle F.
  • the vehicle F enriches its own environment model MDF with the stationary environment model MD, ie it supplements its environment model MDF with the environment model data MD obtained, which were recorded and generated in a stationary manner.
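The enrichment step can be sketched minimally as follows, assuming both the vehicle's model MDF and the stationary model MD are dicts keyed by object id; `enrich_environment_model` and the merge policy (vehicle-side entries take precedence on a collision) are assumptions, as the text leaves the concrete merge open:

```python
def enrich_environment_model(own_objects, infra_objects):
    """Supplement the vehicle's own environment model MDF with the
    stationary environment model data MD recorded by the infrastructure."""
    merged = dict(infra_objects)   # infrastructure-side detections first
    merged.update(own_objects)     # vehicle-side detections take precedence
    return merged

own = {1: "car ahead"}                                   # MDF
infra = {1: "car ahead", 2: "pedestrian behind truck"}   # MD (occluded object)
print(sorted(enrich_environment_model(own, infra)))      # [1, 2]
```

The vehicle thereby gains object 2, which its own sensors could not perceive because of the occlusion.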
  • FIG. 2 shows a schematic representation 20 of an arrangement made up of a vehicle-side and an infrastructure-side sensor system.
  • The infrastructure 23 comprises sensor units 22a, 22b with sensors A2, with which a road section 21 on which a vehicle F is traveling is monitored.
  • Typical sensors are, for example, lidar sensors, radar sensors and optical sensors, such as cameras, with which data from the environment of the infrastructure is recorded.
  • the recorded data is also evaluated, checked for plausibility and exchanged.
  • a defined area is monitored on the infrastructure side by the devices 22a, 22b mentioned, which is also referred to as GeoFence. This area is divided into a core area with high priority and an area with low priority. The monitoring of the different areas mentioned meets different requirements in terms of latency, detection probability, etc.
  • the sensor units 22a, 22b each have communication antennas A1, with which they can communicate with other units, such as an evaluation device 22d or with the vehicle F, with the aid of a communication unit 22c on the roadside.
  • the vehicle F also has a sensor unit A2 with which it can monitor its surroundings.
  • The vehicle F also has an antenna A1, with which it can communicate with the sensor units 22a, 22b or the evaluation device 22d, if necessary via the infrastructure-side communication unit 22c.
  • an autonomous vehicle F has an on-board unit (not shown) for communication, with which it can communicate via the antenna A1 with an infrastructure-side communication unit 22c.
  • the infrastructure-side communication unit 22c transmits its information periodically by radio. For example, the IEEE 802.11 standard or car-to-X can be used for transmission.
  • the vehicle F receives information relating to an environment model of the infrastructure.
  • the information on the surroundings can be transmitted, for example, to all road users in the catchment area of the infrastructure-side communication unit 22c or the traffic cell assigned to it. If an environment model is to be dynamically adapted for individual vehicles, identification can take place via bidirectional communication.
  • FIG. 3 shows an example of a straight stretch 30 which is divided into a core area 31a with high priority and an outer area 31b with low priority.
  • The core area 31a is formed by the driving area of the route 30 for vehicles F and has particularly high safety requirements. These are met by an increased number of different sensors 22a, 22b that monitor this core area.
  • the different sensors complement each other through their properties in terms of detection accuracy and availability in different weather conditions and have redundancy in the acquisition and processing of the monitoring data.
  • The real-time requirements as well as the accuracy and resolution of the environment detection in the core area are significantly higher than in the periphery.
  • the required sensors and evaluation devices are selected and arranged accordingly.
  • In FIG. 4, a junction 40 in the road network is shown. This also has a core area 41a with high priority and a peripheral area 41b with lower priority. The junction 40 is monitored with the aid of three sensor units 22a, 22b.
  • Monitoring data are transmitted from a communication unit 22c by radio to vehicles F within transmission range.
  • FIG. 5 shows a layer model 50 for an environment model with five layers.
  • Four-layer models are already known as a "local dynamic map".
  • The conventional four-layer model is supplemented by an additional layer S5, which can be transferred to a vehicle as a compact representation of a surrounding situation.
  • the lowest layer S1 comprises so-called permanent static data, such as map data.
  • the second lowest layer S2 comprises transient static data, such as the roadside infrastructure. This includes street signs, landmarks, etc.
  • the third layer S3 comprises the transient dynamic data. These include, for example, traffic data and data about the signal phase or data about the current weather-related road conditions.
  • the fourth layer S4 comprises highly dynamic data, such as vehicles and pedestrians.
  • the fourth layer also contains the traffic participants detected by the sensors, such as pedestrians P, cyclists and vehicles F, together with the associated estimated attributes, such as their position, length, speed, detection probability and type.
  • the determination of the type, i.e. the distinction between a pedestrian, a cyclist and a vehicle, is carried out by applying a classification algorithm to the recorded sensor data.
  • the fifth layer S5 is formed by a grid-like occupancy map.
  • the current positions of objects, such as vehicles, are shown as a grid R.
  • the data present in layer S4 are converted into the occupancy grid format.
  • the raw data from the sensors can also be used directly.
  • the monitored area is divided into cells and the probability that the cells are occupied is determined by the infrastructure-side evaluation units and, if necessary, also by the vehicle-side evaluation units. The quality of the determination of the occupation probability and the attributes mentioned is increased by sensor data fusion.
  • scene interpretations, such as a suitable behavior model for the autonomous vehicle F, can also be derived from the environment model.
  • the infrastructural monitoring enables a better assessment of the dynamics in the area under consideration. For this purpose, information about the traffic flow of certain streets and intersections can be collected.
  • Layer S1 is static, whereas layers S4 and S5 concern highly dynamic processes.
  • the information recorded on the infrastructure side relates in particular to layers S4 and S5. Depending on the dynamics and criticality of the available data, these are passed on to the vehicle at a correspondingly high update rate. For highly dynamic data, this typically means at least 15 updates per second.
  • Layer S1 data is static and only needs to be transmitted to the vehicle once a day.
  • the data of the different layers basically have different update frequencies and therefore also have different requirements with regard to real-time data acquisition and data transmission.
  • Layer S5 is used, in particular, for free space modeling with a very small amount of data and thus a high possible update rate and low latency. A vehicle can be informed in real time about a drivable area using data from layer S5.
  • FIG. 6 shows a schematic illustration 60 which illustrates the creation of a digital twin of a node or an intersection.
  • the detected objects F, M, P are shown as cubic objects.
  • This data can then be passed on to an autonomous vehicle for enrichment and expansion.
  • the transmission process can take place through a communication medium such as a roadside unit 22c (see also FIG. 2) or with the aid of 4G / 5G data transmission.
  • the flow of information can take place unidirectionally from the infrastructure 22a, 22b, 22c, 22d to the vehicle F or, conversely, unidirectionally from the vehicle F to the infrastructure 22a, 22b, 22c, 22d.
  • the flow of information can, however, also be bidirectional.
  • FIG. 7 shows a schematic representation of a surrounding model generating device 70.
  • the environment model generation device 70 comprises infrastructure-side sensors 22a, 22b for acquiring sensor data SD in the vicinity of the sensors.
  • the recorded sensor data are transmitted to an evaluation device 22d via a communication unit 22c.
  • the evaluation device 22d comprises a stationary fusion unit 71 for merging the sensor data SD and an evaluation unit 72 for the stationary generation of environment model data MD on the basis of the merged sensor data SD.
  • the evaluation device 22d also includes a stationary communication unit 73 for transmitting the environment model data MD, possibly also via the communication unit 22c, to an autonomously controlled vehicle F.
  • FIG. 8 shows a vehicle control device 80 which has a vehicle-side communication unit 81 for receiving environment model data MD from a stationary environment model generation device 70. Part of the vehicle control device 80 is also a control unit 82 for the automated control of the driving behavior of a vehicle F on the basis of an environment model MD generated by the stationary environment model generation device 70.
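The five-layer environment model described above (permanent static S1 through the occupancy grid S5, each with its own update frequency) can be sketched as a simple data structure. This is an illustrative sketch only; the class, field names and the concrete update rates for S2/S3 are assumptions, while the "at least 15 updates per second" for highly dynamic data and the roughly daily transmission of static layer S1 data follow the text.

```python
from dataclasses import dataclass, field

@dataclass
class EnvironmentModel:
    """Illustrative container for the five layers of the extended local dynamic map."""
    s1_map_data: dict = field(default_factory=dict)        # S1: permanent static data (map data)
    s2_infrastructure: list = field(default_factory=list)  # S2: transient static data (signs, landmarks)
    s3_traffic_state: dict = field(default_factory=dict)   # S3: transient dynamic data (signal phases, weather-related road conditions)
    s4_objects: list = field(default_factory=list)         # S4: highly dynamic data (vehicles, pedestrians)
    s5_occupancy: list = field(default_factory=list)       # S5: grid-like occupancy map for free-space modeling

# Approximate update rates per layer in Hz. S1 once a day and S4/S5 at
# >= 15 updates per second follow the description; S2/S3 are assumed values.
UPDATE_RATE_HZ = {
    "S1": 1.0 / 86400,  # static, transmitted about once a day
    "S2": 1.0 / 3600,   # assumed: hourly
    "S3": 1.0,          # assumed: once per second
    "S4": 15.0,         # highly dynamic objects
    "S5": 15.0,         # occupancy grid, small data volume, low latency
}
```

The split mirrors the different real-time requirements of the layers: only S4 and S5 need the high-rate radio link, while the lower layers can be refreshed far less often.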
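The conversion of detected objects (layer S4) into the grid-like occupancy map of layer S5 can be sketched as follows. The cell size, grid extent and the rule of keeping the highest detection probability per cell are assumptions for this sketch; the patent only states that the monitored area is divided into cells whose occupancy probability is determined.

```python
# Assumed grid parameters: 0.5 m cells covering a 20 m x 20 m monitored area.
CELL_SIZE_M = 0.5
GRID_W, GRID_H = 40, 40

def to_occupancy_grid(objects):
    """Convert S4 object detections into an S5-style occupancy grid.

    objects: list of dicts with keys "x", "y" (position in meters) and
    "p_detect" (detection probability), as estimated per traffic participant.
    """
    grid = [[0.0] * GRID_W for _ in range(GRID_H)]
    for obj in objects:
        cx = int(obj["x"] / CELL_SIZE_M)
        cy = int(obj["y"] / CELL_SIZE_M)
        if 0 <= cx < GRID_W and 0 <= cy < GRID_H:
            # keep the highest occupancy probability reported for this cell
            grid[cy][cx] = max(grid[cy][cx], obj["p_detect"])
    return grid

grid = to_occupancy_grid([
    {"x": 3.2, "y": 1.1, "p_detect": 0.9},    # e.g. a vehicle F
    {"x": 10.0, "y": 10.0, "p_detect": 0.6},  # e.g. a pedestrian P
])
```

Because such a grid is a very compact representation, it can be transmitted at a high update rate to inform a vehicle in real time about the drivable free space.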
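The text notes that the quality of the occupancy probabilities is increased by sensor data fusion across the infrastructure-side (and possibly vehicle-side) evaluation units. One common fusion rule for a single cell, assuming independent evidence from each sensor, is sketched below; the patent does not specify the exact fusion method, so this is an assumption.

```python
def fuse_occupancy(probabilities):
    """Combine per-sensor occupancy probabilities for one grid cell.

    Assumes independent evidence: the cell is free only if every sensor's
    report would leave it free, so the fused occupancy is the complement
    of the product of the per-sensor free probabilities.
    """
    free = 1.0
    for p in probabilities:
        free *= (1.0 - p)
    return 1.0 - free

# Two complementary sensors (e.g. 22a and 22b) both report the cell as
# likely occupied; the fused estimate is higher than either alone.
fused = fuse_occupancy([0.7, 0.8])  # ≈ 0.94
```

This also illustrates why complementary sensors with redundant coverage raise both availability and detection accuracy in the high-priority core area.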

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Optics & Photonics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to a method for generating an environment model for an autonomously controlled vehicle F. In the method, sensor data are acquired by a plurality of infrastructure-side sensors (22a, 22b) in an area surrounding the vehicle F. Fused sensor data are generated in a stationary manner on the basis of the acquired sensor data. Environment model data are furthermore generated in a stationary manner by infrastructure-side evaluation of the sensor data, whereby objects located in the surrounding area can be localized and identified. Finally, the environment model data are communicated to the vehicle F. The invention further relates to a method for the autonomous control of a vehicle F, as well as to an environment model generation device and a vehicle control device. The invention additionally relates to an autonomously controlled vehicle F and an automated transport system. A junction (40) in the road network has a core area (41a) with high priority and a peripheral area (41b) with lower priority. The junction (40) is monitored with the aid of sensor units (22a, 22b). Two vehicles F, each approaching the junction, are located in its vicinity and thus also in the monitored area (41a, 41b). A motorcycle M moving away from the inner monitoring area (41a) is also detected. In addition, a person P crossing a pedestrian crossing Z is detected in the inner monitoring area (41a). The monitoring data are transmitted by radio from a communication unit (22c) to the vehicles F located within transmission range.
Sensor data of the vehicle's surrounding area acquired on the vehicle side are preferably also used to generate the environment model data. The data basis for the vehicle's environment model can advantageously be extended in this way, which improves the reliability, resolution and completeness of the environment model.
EP20725093.7A 2019-06-25 2020-04-21 Détection de l'environnement côté infrastructure lors de la conduite autonome Pending EP3928127A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019209154.7A DE102019209154A1 (de) 2019-06-25 2019-06-25 Infrastrukturseitige Umfelderfassung beim autonomen Fahren
PCT/EP2020/061065 WO2020259892A1 (fr) 2019-06-25 2020-04-21 Détection de l'environnement côté infrastructure lors de la conduite autonome

Publications (1)

Publication Number Publication Date
EP3928127A1 true EP3928127A1 (fr) 2021-12-29

Family

ID=70680458

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20725093.7A Pending EP3928127A1 (fr) 2019-06-25 2020-04-21 Détection de l'environnement côté infrastructure lors de la conduite autonome

Country Status (3)

Country Link
EP (1) EP3928127A1 (fr)
DE (1) DE102019209154A1 (fr)
WO (1) WO2020259892A1 (fr)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102020211478A1 (de) 2020-09-14 2022-03-17 Robert Bosch Gesellschaft mit beschränkter Haftung Konzept zum Unterstützen eines zumindest teilautomatisiert geführten Kraftfahrzeugs
DE102020213661A1 (de) 2020-10-30 2022-05-05 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zum Analysieren eines Umfelds eines Kraftfahrzeugs
DE102021203994A1 (de) 2021-04-21 2022-10-27 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zum Betreiben eines wenigstens teilweise automatisierten Fahrzeugs
DE102021206319A1 (de) 2021-06-21 2022-12-22 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zum infrastrukturgestützten Assistieren eines Kraftfahrzeugs
DE102021207997A1 (de) 2021-07-26 2023-01-26 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zum Überprüfen einer Vollständigkeit eines Modells einer Verkehrsdynamik an einem Verkehrsknotenpunkt
DE102021208616A1 (de) 2021-08-06 2023-02-09 Siemens Mobility GmbH Anordnung von infrastrukturseitiger Überwachungssensorik
DE102021209699A1 (de) 2021-09-03 2023-03-09 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zum Betreiben von mehreren Infrastruktursystemen
DE102021209680A1 (de) 2021-09-03 2023-03-09 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zur Verbesserung der Schätzung von Existenzwahrscheinlichkeiten
DE102021213819A1 (de) 2021-12-06 2023-06-07 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zum Bestimmen einer Datenübertragungsqualität einer Kommunikationsverbindung zwischen vernetzten Kraftfahrzeugen und einem Infrastruktursystem zur Fahrunterstützung von zumindest teilautomatisiert geführten vernetzten Kraftfahrzeugen
DE102021213818A1 (de) 2021-12-06 2023-06-07 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zum Bestimmen einer Datenübertragungsqualität eines Infrastruktursystems zur Fahrunterstützung von zumindest teilautomatisiert geführten vernetzten Kraftfahrzeugen
DE102022201280A1 (de) 2022-02-08 2023-08-10 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren und Vorrichtung zum Betreiben eines Infrastruktursensorsystems
DE102022203289A1 (de) 2022-04-01 2023-10-05 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren und Vorrichtung zum Erkennen von Fehlausrichtungen eines stationären Sensors sowie stationärer Sensor
DE102022206981A1 (de) 2022-07-08 2024-01-11 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zur Fahrunterstützung eines vernetzten Kraftfahrzeugs
DE102022211436A1 (de) 2022-10-28 2024-05-08 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zur Erstellung eines Fahrzeugmodells für ein Kraftfahrzeug, insbesondere für einen LKW
DE102023108153A1 (de) 2023-03-30 2024-10-02 Daimler Truck AG Verfahren zur Lokalisierung einer Schallquelle

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130127822A (ko) * 2012-05-15 2013-11-25 한국전자통신연구원 도로상 물체 분류 및 위치검출을 위한 이종 센서 융합처리 장치 및 방법
DE102016205139B4 (de) * 2015-09-29 2022-10-27 Volkswagen Aktiengesellschaft Vorrichtung und Verfahren zur Charakterisierung von Objekten
DE102016214470B4 (de) * 2016-08-04 2023-06-22 Volkswagen Aktiengesellschaft Verfahren und System zum Erfassen eines Verkehrsumfeldes einer mobilen Einheit
US10692365B2 (en) * 2017-06-20 2020-06-23 Cavh Llc Intelligent road infrastructure system (IRIS): systems and methods
DE102017212533A1 (de) * 2017-07-21 2019-01-24 Robert Bosch Gmbh Vorrichtung und Verfahren zum Bereitstellen einer Zustandsinformation eines automatischen Valet Parking Systems

Also Published As

Publication number Publication date
WO2020259892A1 (fr) 2020-12-30
DE102019209154A1 (de) 2020-12-31

Similar Documents

Publication Publication Date Title
WO2020259892A1 (fr) Détection de l'environnement côté infrastructure lors de la conduite autonome
EP3688742B1 (fr) Système pour produire et/ou actualiser un modèle numérique d'une carte numérique
EP3572293B1 (fr) Procédé d'aide à la conduite d'au moins un véhicule automobile et système d'assistance
EP3363005B1 (fr) Procédé de détermination et de mise à disposition d'une banque de données liée à un environnement prédéfini, contenant des données d'environnement
DE102013107959B4 (de) Verfahren zur Unterstützung des Parkens von Fahrzeugen auf einer Parkfläche, Parkflächenmanagementsystem, fahrzeugseitiges System und Computerprogramm dafür
DE102020111682A1 (de) Systeme und verfahren zum implementieren einer autonomen fahrzeugreaktion auf ein sensorversagen
DE112020004587T5 (de) Verteilter verkehrssicherheitskonsens
DE102017217443B4 (de) Verfahren und System zur Bereitstellung von Trainingsdaten zum maschinellen Lernen für ein Steuerungsmodell einer automatischen Fahrzeugsteuerung
DE102018129048A1 (de) Systeme und verfahren zum bestimmen von sicherheitsereignissen für ein autonomes fahrzeug
DE112017003968T5 (de) Steuerung eines autonomen fahrzeugs
DE102015209947A1 (de) Evakuierungsfahrtassistenzvorrichtung
DE102014220681A1 (de) Verkehrssignalvorhersage
DE102014220678A1 (de) Fahrzeug-fernüberwachung
DE102016212700A1 (de) Verfahren und System zur Steuerung eines Fahrzeugs
EP2936470B1 (fr) Procédé et système pour apprendre d'événements de trafic ainsi que l'utilisation dudit système
EP2858039A1 (fr) Procédé de contrôle automatique de l'arrivée d'un véhicule routier dans une section de route contrôlée, système de contrôle et son système côté véhicule et programme informatique
WO2017102623A1 (fr) Procédé et dispositif de prédiction du déplacement d'un usager de la route dans un zone de trafic
DE102019209552A1 (de) Verfahren zur Verkehrserfassung
DE102018124578A1 (de) System und verfahren zur dynamischen fahrzeuganpassung und zum -tuning
DE112019004772T5 (de) System und Verfahren zum Bereitstellen von unterstützenden Aktionen zur gemeinsamen Straßenbenutzung
DE102013107960B4 (de) Verfahren zur Aktualisierung einer Datenbasis sowie Einrichtung und Computerprogramm
DE112019000969T5 (de) Informationsverarbeitungssystem und Informationsverarbeitungsverfahren
WO2018010875A1 (fr) Procédé et dispositif de création d'une carte de danger pour identifier au moins un emplacement dangereux pour véhicule
DE102019117136A1 (de) Kommunikation und steuerung für verkehrsinfrastruktur
DE102019116962A1 (de) Transportinfrastrukturkommunikation und -steuerung

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20210923

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: ROBERT BOSCH GMBH

17Q First examination report despatched

Effective date: 20231222