EP3928127A1 - Infrastructure-side surroundings acquisition in autonomous driving - Google Patents
Infrastructure-side surroundings acquisition in autonomous driving
- Publication number
- EP3928127A1 (application EP20725093.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- vehicle
- environment model
- data
- sensor data
- sensors
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/06—Systems determining position data of a target
- G01S13/42—Simultaneous measurement of distance and other co-ordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/91—Radar or analogous systems specially adapted for specific applications for traffic control
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/003—Transmission of data between radar, sonar or lidar systems and remote stations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4804—Auxiliary means for detecting or identifying lidar signals or the like, e.g. laser illuminators
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4808—Evaluating distance, position or velocity data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G06V20/54—Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0108—Measuring and analyzing of parameters relative to traffic conditions based on the source of data
- G08G1/0116—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0125—Traffic data processing
- G08G1/0133—Traffic data processing for classifying traffic situation
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0137—Measuring and analyzing of parameters relative to traffic conditions for specific applications
- G08G1/0141—Measuring and analyzing of parameters relative to traffic conditions for specific applications for traffic information dissemination
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096766—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
- G08G1/096775—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a central station
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9316—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles combined with communication equipment with other vehicles or with base stations
Definitions
- the invention relates to a method for generating an environment model of an autonomously controlled vehicle.
- the invention also relates to a method for the autonomous control of a vehicle.
- the invention also relates to an environment model generation device.
- the invention also relates to a vehicle control device.
- the invention also relates to an autonomously controlled vehicle.
- the invention also relates to an automated transport system.
- Autonomous driving is understood to be the independent driving, control and parking of a vehicle without human influence.
- Autonomous driving requires precise knowledge of the position and speed both of the autonomously controlled vehicle itself and of objects located in the vicinity of its route. In the context of autonomous vehicles, the perception of the surroundings and the localization and detection of objects, as well as of the autonomous vehicle itself, are therefore of outstanding importance. If the surroundings are detected only by the autonomous vehicle itself, this detection is limited to the sensors of the autonomous vehicle.
- the sensors used in vehicles, such as lidar, radar, ultrasonic sensors and cameras, have limitations in terms of their range, real-time capability, coverage of the surrounding area and performance, especially in different weather conditions such as strong sunlight and rain.
- In addition, shadowing caused by trucks or cars can occur during vehicle-side environment recognition, creating areas that cannot be perceived by the vehicle's own environment detection.
- Object detection frequently relies on majority decisions, i.e. an object is considered detected when it is detected by a majority of the available sensors but not by a minority. It would be desirable to increase the number of sensors contributing to such a decision beyond the number of sensors directly available to an autonomous vehicle. Infrastructure-side sensors could then also be consulted for such a decision: if, for example, two sensors of the vehicle detect an object and two infrastructure-side sensors also detect it, the object would likewise be accepted as detected.
- Some situations also require a "look around the corner", for example driving on an X-junction or T-junction. Difficult situations, such as entrances and exits from underpasses or bridges, turning maneuvers, overtaking maneuvers, etc., cannot be safely dealt with by sensors on the vehicle either.
- Another problem is that powerful environment detection requires an ever more complex sensor system and increasingly complex environment modeling algorithms. This is associated with immense costs for the individual vehicles. For example, up to 12 lidar devices are currently installed, at a unit price of between 6,000 and 65,000 euros.
- In such situations, the speed of the autonomous vehicle is greatly reduced in order to ensure the safety of the vehicle, or emergency braking occurs, as a result of which occupants of autonomous vehicles are endangered. Attempts should therefore be made to avoid such emergency braking.
- a safety driver is required, who forms a fallback level of the system.
- the object is therefore to provide a method and a device for improved environment recognition and localization of autonomous vehicles.
- This object is achieved by a method for generating an environment model of an autonomously controlled vehicle according to claim 1, a method for the autonomous control of a vehicle according to claim 10, an environment model generation device according to claim 11, a vehicle control device according to claim 12, an autonomously controlled vehicle according to claim 13 and an automated transport system according to claim 14.
- sensor data are recorded by a plurality of infrastructure-side sensors in a surrounding area of the vehicle.
- the merged sensor data are generated on the basis of the recorded sensor data.
- environment model data are generated in a stationary manner by evaluating the sensor data on the infrastructure side, with objects located in the surrounding area being localized and identified. Finally, the environment model data are transmitted to the vehicle.
- the sensors on the infrastructure side work on the principle of geofencing. If an autonomously controlled vehicle enters a communication area of the infrastructure-side monitoring, communication takes place between the infrastructure and the autonomous vehicle. As part of this communication, the vehicle receives information about the permanently defined surrounding area monitored by the sensors and objects that are present or moving therein. An environment model of the autonomous vehicle can be created or supplemented on the basis of this information. If the autonomous vehicle drives into the detection area of the infrastructure-side sensors, it is detected and this information is integrated into the environment model of the area monitored by the sensors. If the autonomously controlled vehicle leaves the communication area, communication between the infrastructure-side monitoring units and the relevant autonomously controlled vehicle ends. The communication area and the monitoring area do not have to, but can also be identical.
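The geofencing entry/exit logic described above can be sketched as follows. This is an illustrative Python sketch, not the patented implementation: the class name, the circular communication area and the coordinate handling are all assumptions made for the example.

```python
import math

class GeofenceMonitor:
    """Hypothetical sketch of the infrastructure-side geofencing logic:
    communication with a vehicle starts when it enters the communication
    area and ends when it leaves."""

    def __init__(self, center, comm_radius_m):
        self.center = center
        self.comm_radius_m = comm_radius_m
        self.connected = set()  # vehicle IDs currently in the communication area

    def _inside(self, pos):
        dx, dy = pos[0] - self.center[0], pos[1] - self.center[1]
        return math.hypot(dx, dy) <= self.comm_radius_m

    def update(self, vehicle_id, pos):
        """Process one position report; returns 'enter', 'leave' or None."""
        inside = self._inside(pos)
        if inside and vehicle_id not in self.connected:
            self.connected.add(vehicle_id)
            return "enter"   # start transmitting the environment model
        if not inside and vehicle_id in self.connected:
            self.connected.discard(vehicle_id)
            return "leave"   # end communication with this vehicle
        return None

monitor = GeofenceMonitor(center=(0.0, 0.0), comm_radius_m=100.0)
print(monitor.update("F1", (50.0, 20.0)))   # enter
print(monitor.update("F1", (60.0, 30.0)))   # still inside: None
print(monitor.update("F1", (200.0, 0.0)))   # leave
```

As noted in the text, the communication area checked here need not coincide with the monitored detection area; a real system would track both regions separately.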
- an environment model is generated by using the method according to the invention for generating an environment model of an autonomously controlled vehicle.
- the environment model can be transmitted statically for a defined so-called traffic cell in which the autonomous vehicle is currently located.
- the current vehicle position can be used to dynamically track the transmitted environment model section.
- the environment model generating device has infrastructure-side sensors for acquiring sensor data in the area surrounding the sensors.
- Part of the environment model generation device according to the invention is also a stationary fusion unit for merging the sensor data.
- the environment model generation device according to the invention also has an evaluation unit for the stationary generation of environment model data on the basis of the merged sensor data. Objects located in the surrounding area are localized and identified.
- Part of the environment model generation device according to the invention is also a stationary communication unit for transmitting the environment model data to an autonomously controlled vehicle.
- the environment model generation device shares the advantages of the method according to the invention for generating an environment model of an autonomously controlled vehicle.
- the vehicle control device has a vehicle-side communication unit for receiving environment model data from the stationary environment model generation device. Part of the vehicle control device according to the invention is also a control unit for the automated control of the driving behavior of a vehicle on the basis of the environment model generated by the stationary environment model generation device according to the invention.
- the vehicle control device shares the advantages of the method according to the invention for autonomous control of a vehicle.
- the autonomously controlled vehicle according to the invention comprises the vehicle control device according to the invention.
- the autonomously controlled vehicle according to the invention shares the advantages of the vehicle control device according to the invention.
- the automated transport system according to the invention has a surrounding model generating device arranged on the infrastructure side according to the invention and at least one autonomously controlled vehicle according to the invention.
- An object localization can advantageously be supplemented by the infrastructure or at least partially outsourced to it, so that the performance of the object localization and object recognition of an autonomously controlled vehicle is improved.
- Some components of the environment model generation device according to the invention, the vehicle control device according to the invention, the autonomously controlled vehicle according to the invention and the automated transport system according to the invention can for the most part be designed in the form of software components. This applies in particular to parts of the stationary fusion unit, the evaluation unit and the control unit. Basically, however, these components can also be implemented in part, especially where particularly fast calculations are concerned, in the form of software-supported hardware, for example FPGAs or the like. Likewise, the required interfaces, for example if it is only a matter of transferring data from other software components, can be designed as software interfaces. However, they can also be configured as hardware-based interfaces that are controlled by suitable software.
- a largely software-based implementation has the advantage that computer systems already present in a mobile object or in the infrastructure can easily be retrofitted with a software update, possibly after the addition of extra hardware elements such as sensors and communication units, in order to work in the manner of the invention.
- the object is also achieved by a corresponding computer program product with a computer program that can be loaded directly into a memory device of such a computer system, with program sections for carrying out the software-implementable steps of the method according to the invention when the computer program is executed in the computer system.
- such a computer program product can optionally contain additional components such as documentation and / or additional components, including hardware components, such as Hardware keys (dongles etc.) for using the software.
- a computer-readable medium, for example a memory stick, a hard disk or some other transportable or permanently installed data carrier, on which the program sections of the computer program that can be read and executed by a computer unit are stored, can be used for transport to the memory device of the computer system and/or for storage on the computer system.
- the computing unit can, for example, have one or more cooperating microprocessors or the like for this purpose.
- the sensors include radar sensors and / or lidar sensors.
- radar waves preferably have wavelengths of a few millimeters to centimeters. Radar offers complete volume illumination without blind spots, but with a comparable sensor size, a lower selectivity is achieved than with lidar. But radar is robust in almost all weather conditions, such as rain, snow, fog, darkness or direct sunlight.
- Lidar has the property of being able to display object edges more precisely than radar. However, Lidar has a very limited field of vision and is also more susceptible to adverse weather conditions.
- the two detection technologies can also be advantageously combined in order to combine the various advantages of the different technologies described with one another and to compensate for the disadvantages described.
- the sensors can also have cameras. With cameras, an extensive area can be recorded and monitored simultaneously with good resolution.
- a combination of radar and / or lidar with cameras as sensors can also be used advantageously.
- the environment model data are preferably generated object-based, with attributes being assigned to individual objects.
- the attributes include information on relevant properties of the detected objects. These properties enable a hazard assessment in connection with a detected object and allow the behavior of a detected object to be predicted to a certain extent.
- the named attributes of an object can include at least one of the following attribute types:
- Further attributes can include time stamps, an identification number, the length, width and height of a vehicle or object, contours, parameters close to raw data, such as the number of reflection points or a security level or reliability level of transmitted data.
- the reliability of the data can depend, for example, on whether they were estimated or otherwise determined.
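The object-based representation with the attributes listed above can be sketched as a simple record type. The field names and default values below are illustrative assumptions, not taken from the claims; only the attribute categories (timestamp, identification number, dimensions, raw-data-related values, reliability) follow the text.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    """Illustrative object record for an object-based environment model."""
    object_id: int            # identification number
    timestamp: float          # acquisition time, e.g. Unix seconds
    object_type: str          # "pedestrian", "cyclist", "vehicle", ...
    position: tuple           # (x, y) in a map frame, metres
    speed: float              # m/s
    length: float = 0.0       # bounding-box dimensions in metres
    width: float = 0.0
    height: float = 0.0
    reflection_points: int = 0   # attribute close to the raw data
    reliability: float = 1.0     # 1.0 = measured, < 1.0 = estimated

obj = DetectedObject(object_id=7, timestamp=1700000000.0,
                     object_type="pedestrian", position=(12.3, 4.5),
                     speed=1.4, reliability=0.8)
print(obj.object_type, obj.reliability)
```

The `reliability` field illustrates the point made above: a consumer of the environment model can weight estimated values lower than directly measured ones.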
- the environment model data include a grid-based occupancy map.
- a grid-based occupancy map, also referred to as an "occupancy grid map", divides the monitored area into a plurality of cells, the occupancy probability of which is determined on the basis of sensor data.
- An object can be identified on the basis of the position and number of occupied cells in the grid of the grid-based occupancy map, and a detection probability value of an object can be specified on the basis of the occupancy probability of the cells.
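A minimal occupancy-grid sketch is given below. The log-odds representation used here is a standard textbook technique for fusing independent occupancy estimates per cell; the patent text does not prescribe this particular fusion rule, so treat it as one possible realization.

```python
import math

class OccupancyGrid:
    """Minimal occupancy-grid sketch: each cell holds a log-odds value
    that several independent sensor observations can update."""

    def __init__(self, width, height, cell_size_m):
        self.cell_size_m = cell_size_m
        self.log_odds = [[0.0] * width for _ in range(height)]  # 0.0 = p 0.5

    def update_cell(self, row, col, p_occupied):
        """Fuse one sensor's occupancy estimate for a cell."""
        self.log_odds[row][col] += math.log(p_occupied / (1.0 - p_occupied))

    def probability(self, row, col):
        return 1.0 / (1.0 + math.exp(-self.log_odds[row][col]))

grid = OccupancyGrid(width=10, height=10, cell_size_m=0.5)
# two independent sensors both report the same cell as likely occupied;
# the fused probability exceeds either single estimate of 0.7
grid.update_cell(3, 4, 0.7)
grid.update_cell(3, 4, 0.7)
print(round(grid.probability(3, 4), 3))
```

This illustrates the point in the text that sensor data fusion increases the quality of the occupancy probability: agreeing observations reinforce each other.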
- the surrounding area is divided into sub-areas with different priorities.
- Areas with different priorities can, for example, have different requirements for real-time capabilities of sensor detection, different security requirements and system requirements derived from them.
- a high priority may require the use of different sensors with different performance in different weather conditions, a high level of redundancy in the detection of objects and the processing of sensor data, high real-time requirements and a high level of accuracy and resolution in the area detection.
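The priority-dependent requirements described above could be captured in a simple configuration mapping. All concrete numbers below are assumptions chosen for the sketch; the patent only states that the requirements differ, not what their values are.

```python
# Illustrative mapping of area priority to derived system requirements.
AREA_REQUIREMENTS = {
    "high": {                       # core area, e.g. the driving area
        "max_latency_ms": 50,       # tight real-time budget
        "min_sensor_types": 2,      # redundancy across sensor principles
        "min_detection_prob": 0.99,
        "grid_cell_size_m": 0.25,   # high accuracy and resolution
    },
    "low": {                        # peripheral area
        "max_latency_ms": 500,
        "min_sensor_types": 1,
        "min_detection_prob": 0.9,
        "grid_cell_size_m": 1.0,
    },
}

def requirements_for(priority):
    """Look up the system requirements derived from an area's priority."""
    return AREA_REQUIREMENTS[priority]

print(requirements_for("high")["max_latency_ms"], "ms in the core area")
```

A planner could then select sensors and evaluation hardware for each sub-area so that these derived requirements are met.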
- a significant proportion of the processing of the sensor data is preferably carried out in each sensor itself, taking into account the specific sensor properties. No further prior knowledge is therefore required to generate the occupancy map. This reduces the amount of data collected by all sensors and the computing load for creating the occupancy map. However, this can be at the expense of the accuracy of the result, since information that could be helpful for a possible data fusion when creating the occupancy map can be lost during the sensor-internal processing.
- the grid-based occupancy map is preferably generated on the basis of the object-based environment model data.
- the amount of data for generating the grid-based occupancy map is relatively small, so that the grid-based occupancy map can be created very quickly.
- the grid-based occupancy map is particularly preferably generated directly on the basis of the sensor data.
- the captured data can be processed particularly precisely, since no intermediate steps can lead to a loss of information.
- the processing takes place here on the basis of sensor data, which are composed in such a way that no information that is helpful for increasing the accuracy of the data fusion is lost due to any preprocessing.
- the amount of data is not necessarily increased in this case; however, certain sensor properties or prior knowledge of the acquisition unit of the sensor, such as the resolution, the measurement accuracy, etc., must be known and taken into account during the fusion.
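The variant of generating the occupancy map from object-based environment model data can be sketched as a rasterization step: each object's bounding box marks the cells it covers as occupied. The axis-aligned boxes and the tuple layout are deliberate simplifications for the example.

```python
def objects_to_grid(objects, width, height, cell_size_m):
    """Rasterize object-based environment model data into an occupancy
    grid: cells covered by an object's (axis-aligned) bounding box are
    marked occupied. A deliberately simplified sketch."""
    grid = [[0] * width for _ in range(height)]
    for (x, y, length, width_m) in objects:  # centre position + box size
        c0 = max(0, int((x - length / 2) / cell_size_m))
        c1 = min(width - 1, int((x + length / 2) / cell_size_m))
        r0 = max(0, int((y - width_m / 2) / cell_size_m))
        r1 = min(height - 1, int((y + width_m / 2) / cell_size_m))
        for r in range(r0, r1 + 1):
            for c in range(c0, c1 + 1):
                grid[r][c] = 1
    return grid

# one 4 m x 2 m vehicle centred at (5 m, 5 m) on a 0.5 m grid
grid = objects_to_grid([(5.0, 5.0, 4.0, 2.0)], width=20, height=20,
                       cell_size_m=0.5)
print(sum(map(sum, grid)), "cells occupied")
```

Because this path starts from already-processed object data, it is fast but inherits any information loss from the preceding detection step, as the text notes.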
- sensor data recorded on the vehicle side from the area surrounding the vehicle are also used to generate the environment model data.
- a database for the environment model of the vehicle can advantageously be expanded, so that the reliability, resolution and completeness of the environment model is improved.
- FIG. 1 shows a flow diagram which illustrates a method for generating an environment model of an autonomously controlled vehicle according to an exemplary embodiment of the invention
- FIG. 2 shows a schematic representation of an automated transport system
- FIG. 3 shows a schematic representation of a stationarily monitored straight route with areas with different priorities
- FIG. 5 shows a schematic representation of a layer model of determined environment data
- FIG. 6 shows a schematic representation which illustrates the creation of a digital twin of a node or an intersection
- FIG. 7 shows an environment model generating device according to an exemplary embodiment of the invention
- FIG. 8 shows a schematic representation of a vehicle control device.
- FIG. 1 shows a flow diagram 100 which illustrates a method for generating an environment model MD of an autonomously controlled vehicle F according to an exemplary embodiment of the invention.
- In step 1.I, sensor data SD are recorded by a plurality of infrastructure-side sensors in an area surrounding the vehicle F.
- the sensors used to acquire the sensor data can include, for example, lidar sensors, radar sensors and cameras.
- the sensors are arranged on sensor masts at regular intervals along a driveway or around an intersection or junction and each monitor a closer environment.
- In step 1.II, the sensor data SD from the different sensors are merged, so that merged sensor data FSD are generated on the basis of the acquired sensor data.
- In step 1.III, an environment model MD of a road section, for example an intersection or a straight section, is generated in a stationary manner on the basis of the merged sensor data FSD. In the context of the environment model, objects located in the surrounding area are localized and identified.
- the environment model data MD are also transmitted to the vehicle F.
- the vehicle F enriches its own environment model MDF with the stationary environment model MD, ie it supplements its environment model MDF with the environment model data MD obtained, which were recorded and generated in a stationary manner.
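The steps 1.I to 1.III above can be sketched as a small pipeline. The fusion rule (averaging positions reported for the same object ID) and the data layout are placeholder assumptions standing in for the real fusion and evaluation units.

```python
def generate_environment_model(sensor_readings):
    """Sketch of the stationary pipeline from FIG. 1: acquisition (1.I),
    fusion (1.II) and evaluation / model generation (1.III)."""
    # 1.I: sensor data SD from several infrastructure-side sensors
    sd = [r for r in sensor_readings if r is not None]
    # 1.II: merge readings that refer to the same object
    # (placeholder fusion: average the reported positions per object ID)
    grouped = {}
    for obj_id, pos in sd:
        grouped.setdefault(obj_id, []).append(pos)
    fsd = {obj_id: tuple(sum(coord) / len(coord) for coord in zip(*positions))
           for obj_id, positions in grouped.items()}
    # 1.III: environment model MD with localized, identified objects
    return [{"id": obj_id, "position": pos}
            for obj_id, pos in sorted(fsd.items())]

readings = [(1, (10.0, 2.0)), (1, (10.2, 2.2)), (2, (30.0, 5.0)), None]
print(generate_environment_model(readings))
```

The resulting model data MD would then be transmitted to the vehicle F, which merges them into its own environment model MDF.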
- FIG. 2 shows a schematic representation 20 of an arrangement made up of a vehicle-side and an infrastructure-side sensor system.
- the infrastructure 23 comprises sensor units 22a, 22b with sensors A2, with which a road section 21 is monitored on which a vehicle F is on the way.
- Typical sensors are, for example, lidar sensors, radar sensors and optical sensors, such as cameras, with which data from the environment of the infrastructure is recorded.
- the recorded data is also evaluated, checked for plausibility and exchanged.
- a defined area is monitored on the infrastructure side by the devices 22a, 22b mentioned, which is also referred to as GeoFence. This area is divided into a core area with high priority and an area with low priority. The monitoring of the different areas mentioned meets different requirements in terms of latency, detection probability, etc.
- the sensor units 22a, 22b each have communication antennas A1, with which they can communicate with other units, such as an evaluation device 22d or with the vehicle F, with the aid of a communication unit 22c on the roadside.
- the vehicle F also has a sensor unit A2 with which it can monitor its surroundings.
- the vehicle F also has an antenna A1, with which it can communicate with the sensor units 22a, 22b or the evaluation device 22d, if necessary via the infrastructure-side communication unit 22c.
- an autonomous vehicle F has an on-board unit (not shown) for communication, with which it can communicate via the antenna A1 with an infrastructure-side communication unit 22c.
- the infrastructure-side communication unit 22c transmits its information periodically by radio. For example, the IEEE 802.11 standard or car-to-X can be used for transmission.
- the vehicle F receives information relating to an environment model of the infrastructure.
- the information on the surroundings can be transmitted, for example, to all road users in the catchment area of the infrastructure-side communication unit 22c or the traffic cell assigned to it. If an environment model is to be dynamically adapted for individual vehicles, identification can take place via bidirectional communication.
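The periodic transmission described above can be illustrated with a short sketch. A real deployment would use an IEEE 802.11p / car-to-X stack as mentioned in the text; here a plain UDP datagram (sent to loopback for the demo) merely stands in for one radio transmission of a JSON-encoded environment model, and the port number and payload layout are assumptions.

```python
import json
import socket

def transmit_environment_model(model, port=37020):
    """Send one environment-model update as a UDP datagram.
    Stand-in for the periodic car-to-X radio transmission."""
    payload = json.dumps(model).encode("utf-8")
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(payload, ("127.0.0.1", port))  # loopback instead of radio
    finally:
        sock.close()
    return len(payload)  # bytes sent in this transmission period

model = {"cell": "junction-40", "objects": [{"id": 1, "type": "vehicle"}]}
print(transmit_environment_model(model), "bytes sent")
```

Calling this on a timer would realize the periodic broadcast; per-vehicle dynamic adaptation would additionally require the bidirectional identification mentioned above.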
- FIG. 3 shows an example of a straight stretch 30 which is divided into a core area 31a with high priority and an outer area 31b with low priority.
- the core area 31a is formed by the driving area of the route 30 for vehicles F and has particularly high safety requirements. These are met by an increased number of different sensors 22a, 22b that monitor this core area.
- the different sensors complement each other through their properties in terms of detection accuracy and availability in different weather conditions and have redundancy in the acquisition and processing of the monitoring data.
- the real-time requirements as well as the accuracy and resolution of the environment detection in the core area are significantly higher than in the periphery.
- the required sensors and evaluation devices are selected and arranged accordingly.
- a junction 40 in the road network is shown. This also has a core area 41a with high priority and a peripheral area 41b with lower priority. The junction 40 is monitored with the help of three sensor units 22a, 22b.
- monitoring data are transmitted from a communication unit 22c by radio to vehicles F in the transmission range.
- FIG. 5 shows a layer model 50 for an environment model with five layers.
- Four-layer models are already known as a “local dynamic map”.
- the conventional four-layer model is supplemented by an additional layer S5, which can be transmitted to a vehicle as a compact representation of the surrounding situation.
- the lowest layer S1 comprises so-called permanent static data, such as map data.
- the second lowest layer S2 comprises transient static data, such as the roadside infrastructure. This includes street signs, landmarks, etc.
- the third layer S3 comprises the transient dynamic data. These include, for example, traffic data and data about the signal phase or data about the current weather-related road conditions.
- the fourth layer S4 comprises highly dynamic data, such as vehicles and pedestrians.
- the fourth layer S4 thus contains the traffic participants detected by the sensors, such as pedestrians P, cyclists and vehicles F, as well as the associated estimated attributes, such as their position, length, speed, detection probability and type.
- the determination of the type, i.e. the distinction between a pedestrian, a cyclist and a vehicle, is carried out by applying a classification algorithm to the recorded sensor data.
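The object attributes and the type determination described above can be sketched as a simple data structure with a rule-based classifier. This is purely illustrative: the class names and thresholds are assumptions, and a real system would apply the classification algorithm to the recorded sensor data itself rather than to estimated attributes.

```python
from dataclasses import dataclass
from enum import Enum

class ParticipantType(Enum):
    PEDESTRIAN = "pedestrian"
    CYCLIST = "cyclist"
    VEHICLE = "vehicle"

@dataclass
class DetectedObject:
    """Detected traffic participant with the estimated attributes
    named for layer S4 (position, length, speed, detection probability)."""
    x: float                       # position in the local map frame [m]
    y: float
    length: float                  # estimated object length [m]
    speed: float                   # estimated speed [m/s]
    detection_probability: float   # confidence of the detection [0..1]

def classify(obj: DetectedObject) -> ParticipantType:
    # Toy stand-in for the classification algorithm; the thresholds
    # are hypothetical and chosen only to make the distinction concrete.
    if obj.length > 3.0:
        return ParticipantType.VEHICLE
    if obj.speed > 2.5:            # faster than typical walking pace
        return ParticipantType.CYCLIST
    return ParticipantType.PEDESTRIAN
```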
- the fifth layer S5 is formed by a grid-like occupancy map.
- the current positions of objects, such as vehicles, are shown as a grid R.
- the data present in layer S4 are converted into the occupancy grid format.
- the raw data from the sensors can also be used directly.
- the monitored area is divided into cells and the probability that the cells are occupied is determined by the infrastructure-side evaluation units and, if necessary, also by the vehicle-side evaluation units. The quality of the determination of the occupation probability and the attributes mentioned is increased by sensor data fusion.
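The per-cell occupancy probability and its improvement through sensor data fusion can be sketched with the standard log-odds update for one grid cell. The function names and the uniform prior are assumptions for illustration; the document does not prescribe a specific fusion algorithm.

```python
import math

def log_odds(p: float) -> float:
    """Convert a probability to log-odds form."""
    return math.log(p / (1.0 - p))

def prob(l: float) -> float:
    """Convert log-odds back to a probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(l))

def fuse_cell(measurements, prior: float = 0.5) -> float:
    """Fuse independent occupancy estimates for one grid cell.

    Each measurement is the occupancy probability one sensor (or one
    evaluation unit) assigns to the cell; fusing them in log-odds space
    sharpens the estimate, as described for the occupancy grid of layer S5.
    """
    l = log_odds(prior)
    for p in measurements:
        l += log_odds(p) - log_odds(prior)
    return prob(l)
```

With a neutral prior of 0.5, two agreeing measurements of 0.7 and 0.8 fuse to roughly 0.90, i.e. a higher confidence than either sensor alone provides.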
- scene interpretations, such as a suitable behavior model for the autonomous vehicle F, can also be derived from the environment model.
- the infrastructural monitoring enables a better assessment of the dynamics in the area under consideration. For this purpose, information about the traffic flow of certain streets and intersections can be collected.
- Layer S1 is static, whereas layers S4 and S5 concern highly dynamic processes.
- the information recorded on the infrastructure side relates in particular to layers S4 and S5. Depending on the dynamics and criticality of the available data, these are passed on to the vehicle at a correspondingly high update rate. For highly dynamic data, this is typically at least 15 updates per second.
- Layer S1 data, by contrast, is static and only needs to be transmitted to the vehicle once a day, for example.
- the data of the different layers basically have different update frequencies and therefore also have different requirements with regard to real-time data acquisition and data transmission.
- Layer S5 is used, in particular, for free space modeling with a very small amount of data and thus a high possible update rate and low latency. A vehicle can be informed in real time about a drivable area using data from layer S5.
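The layer-dependent update frequencies described above can be made concrete as a simple schedule. Only the figures named in the text (at least 15 updates per second for highly dynamic data, roughly daily for layer S1) come from the document; the periods for S2 and S3 are assumptions chosen to illustrate the gradient between the layers.

```python
# Illustrative update periods per layer, in seconds.
UPDATE_PERIOD_S = {
    "S1": 24 * 3600.0,   # permanent static data (map): roughly daily
    "S2": 3600.0,        # transient static data: e.g. hourly (assumed)
    "S3": 1.0,           # transient dynamic data: e.g. once per second (assumed)
    "S4": 1.0 / 15.0,    # highly dynamic objects: at least 15 updates/s
    "S5": 1.0 / 15.0,    # occupancy grid / free space: at least 15 updates/s
}

def layers_due(elapsed_s: float):
    """Return the layers whose data should be retransmitted to the
    vehicle after `elapsed_s` seconds since their last update."""
    return [layer for layer, period in UPDATE_PERIOD_S.items()
            if elapsed_s >= period]
```

For example, 100 ms after the last broadcast only the highly dynamic layers S4 and S5 are due again, while the static layers are not.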
- FIG. 6 shows a schematic illustration 60 which illustrates the creation of a digital twin of a node or an intersection.
- the detected objects F, M, P are shown as cubic objects.
- This data can then be passed on to an autonomous vehicle for enrichment and expansion.
- the transmission process can take place through a communication medium such as a roadside unit 22c (see also FIG. 2) or with the aid of 4G/5G data transmission.
- the flow of information can take place unidirectionally from the infrastructure 22a, 22b, 22c, 22d to the vehicle F or, conversely, unidirectionally from the vehicle F to the infrastructure 22a, 22b, 22c, 22d.
- the flow of information can, however, also be bidirectional.
- FIG. 7 shows a schematic representation of an environment model generation device 70.
- the environment model generation device 70 comprises infrastructure-side sensors 22a, 22b for acquiring sensor data SD in the vicinity of the sensors.
- the recorded sensor data are transmitted to an evaluation device 22d via a communication unit 22c.
- the evaluation device 22d comprises a stationary fusion unit 71 for merging the sensor data SD and an evaluation unit 72 for the stationary generation of environment model data MD on the basis of the merged sensor data SD.
- the evaluation device 22d also includes a stationary communication unit 73 for transmitting the environment model data MD, possibly also via the communication unit 22c, to an autonomously controlled vehicle F.
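The processing chain of the environment model generation device 70 (fusion unit 71, evaluation unit 72, communication unit 73) can be sketched as follows. The class and method names are illustrative, and the fusion and evaluation bodies are deliberate placeholders; a real implementation would associate, filter and classify the detections as described above.

```python
from dataclasses import dataclass

@dataclass
class SensorData:
    sensor_id: str            # e.g. "22a" or "22b"
    detections: list          # raw detections from one infrastructure sensor

@dataclass
class EnvironmentModelData:
    objects: list             # fused object list (layer S4)
    occupancy: list           # occupancy grid cells (layer S5)

class EnvironmentModelGenerator:
    """Sketch of device 70 with units 71 (fusion), 72 (evaluation)
    and 73 (communication); placeholder logic only."""

    def fuse(self, sensor_data: list) -> list:
        # fusion unit 71: merge per-sensor detections (placeholder:
        # simple concatenation instead of real sensor data fusion)
        return [d for sd in sensor_data for d in sd.detections]

    def evaluate(self, fused: list) -> EnvironmentModelData:
        # evaluation unit 72: derive environment model data MD
        return EnvironmentModelData(objects=fused, occupancy=[])

    def transmit(self, md: EnvironmentModelData, send) -> None:
        # communication unit 73: hand MD to the radio link toward vehicle F
        send(md)
```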
- FIG. 8 shows a vehicle control device 80 which has a vehicle-side communication unit 81 for receiving environment model data MD from a stationary environment model generation device 70. Part of the vehicle control device 80 is also a control unit 82 for the automated control of the driving behavior of a vehicle F on the basis of an environment model MD generated by the stationary environment model generation device 70.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Computer Networks & Wireless Communication (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Electromagnetism (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Life Sciences & Earth Sciences (AREA)
- Atmospheric Sciences (AREA)
- Optics & Photonics (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102019209154.7A DE102019209154A1 (en) | 2019-06-25 | 2019-06-25 | Infrastructure detection of the surroundings in autonomous driving |
PCT/EP2020/061065 WO2020259892A1 (en) | 2019-06-25 | 2020-04-21 | Infrastructure-side surroundings acquisition in autonomous driving |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3928127A1 true EP3928127A1 (en) | 2021-12-29 |
Family
ID=70680458
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP20725093.7A Pending EP3928127A1 (en) | 2019-06-25 | 2020-04-21 | Infrastructure-side surroundings acquisition in autonomous driving |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3928127A1 (en) |
DE (1) | DE102019209154A1 (en) |
WO (1) | WO2020259892A1 (en) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102020211478A1 (en) | 2020-09-14 | 2022-03-17 | Robert Bosch Gesellschaft mit beschränkter Haftung | Concept for supporting an at least partially automated motor vehicle |
DE102020213661A1 (en) | 2020-10-30 | 2022-05-05 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method for analyzing the surroundings of a motor vehicle |
DE102021203994A1 (en) | 2021-04-21 | 2022-10-27 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method for operating an at least partially automated vehicle |
DE102021206319A1 (en) | 2021-06-21 | 2022-12-22 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method for infrastructure-supported assistance in a motor vehicle |
DE102021207997A1 (en) | 2021-07-26 | 2023-01-26 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method for checking completeness of a model of traffic dynamics at a traffic junction |
DE102021208616A1 (en) | 2021-08-06 | 2023-02-09 | Siemens Mobility GmbH | Arrangement of infrastructure-side monitoring sensors |
DE102021209699A1 (en) | 2021-09-03 | 2023-03-09 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method for operating multiple infrastructure systems |
DE102021209680A1 (en) | 2021-09-03 | 2023-03-09 | Robert Bosch Gesellschaft mit beschränkter Haftung | Procedure for improving the estimation of existence probabilities |
DE102021213819A1 (en) | 2021-12-06 | 2023-06-07 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method for determining a data transmission quality of a communication connection between networked motor vehicles and an infrastructure system for driving support of at least partially automated networked motor vehicles |
DE102021213818A1 (en) | 2021-12-06 | 2023-06-07 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method for determining a data transmission quality of an infrastructure system for driving support of networked motor vehicles that are at least partially automated |
DE102022201280A1 (en) | 2022-02-08 | 2023-08-10 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method and device for operating an infrastructure sensor system |
DE102022203289A1 (en) | 2022-04-01 | 2023-10-05 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method and device for detecting misalignments of a stationary sensor and stationary sensor |
DE102022206981A1 (en) | 2022-07-08 | 2024-01-11 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method for driving support for a networked motor vehicle |
DE102022211436A1 (en) | 2022-10-28 | 2024-05-08 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method for creating a vehicle model for a motor vehicle, in particular for a truck |
DE102023108153A1 (en) | 2023-03-30 | 2024-10-02 | Daimler Truck AG | Method for localizing a sound source |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20130127822A (en) * | 2012-05-15 | 2013-11-25 | 한국전자통신연구원 | Apparatus and method of processing heterogeneous sensor fusion for classifying and positioning object on road |
DE102016205139B4 (en) * | 2015-09-29 | 2022-10-27 | Volkswagen Aktiengesellschaft | Device and method for characterizing objects |
DE102016214470B4 (en) * | 2016-08-04 | 2023-06-22 | Volkswagen Aktiengesellschaft | Method and system for capturing a traffic environment of a mobile unit |
US10692365B2 (en) * | 2017-06-20 | 2020-06-23 | Cavh Llc | Intelligent road infrastructure system (IRIS): systems and methods |
DE102017212533A1 (en) * | 2017-07-21 | 2019-01-24 | Robert Bosch Gmbh | Device and method for providing state information of an automatic valet parking system |
-
2019
- 2019-06-25 DE DE102019209154.7A patent/DE102019209154A1/en not_active Withdrawn
-
2020
- 2020-04-21 EP EP20725093.7A patent/EP3928127A1/en active Pending
- 2020-04-21 WO PCT/EP2020/061065 patent/WO2020259892A1/en unknown
Also Published As
Publication number | Publication date |
---|---|
WO2020259892A1 (en) | 2020-12-30 |
DE102019209154A1 (en) | 2020-12-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020259892A1 (en) | Infrastructure-side surroundings acquisition in autonomous driving | |
EP3688742B1 (en) | System for producing and/or updating a digital model of a digital map | |
EP3572293B1 (en) | Method for assisting driving of at least one motor vehicle and assistance system | |
EP3363005B1 (en) | Method for ascertaining and providing a database which relates to a specified surrounding area and contains environmental data | |
DE102013107959B4 (en) | A method of assisting the parking of vehicles on a parking area, parking area management system, on-vehicle system and computer program therefor | |
DE102020111682A1 (en) | SYSTEMS AND METHODS FOR IMPLEMENTING AN AUTONOMOUS VEHICLE REACTION TO A SENSOR FAILURE | |
DE112020004587T5 (en) | DISTRIBUTED ROAD SAFETY CONSENSATION | |
DE102017217443B4 (en) | Method and system for providing training data for machine learning for a control model of an automatic vehicle control | |
DE102018129048A1 (en) | SYSTEMS AND METHOD FOR DETERMINING SAFETY EVENTS FOR AN AUTONOMOUS VEHICLE | |
DE112017003968T5 (en) | CONTROL OF AN AUTONOMOUS VEHICLE | |
DE102015209947A1 (en) | Evacuation travel assistance device | |
DE102014220681A1 (en) | Traffic signal prediction | |
DE102014220678A1 (en) | VEHICLE REMOTE MONITORING | |
DE102016212700A1 (en) | Method and system for controlling a vehicle | |
EP2936470B1 (en) | Method and system for learning traffic events, and use of the system | |
EP2858039A1 (en) | Method for automatically control of the entry of a road vehicle into a controlled stretch of road, control system and on-board system for the same, and computer program | |
WO2017102623A1 (en) | Method and device for predicting a movement of a road traffic participant in a traffic space | |
DE102019209552A1 (en) | Procedure for traffic registration | |
DE102018124578A1 (en) | SYSTEM AND METHOD FOR DYNAMIC VEHICLE ADJUSTMENT AND TUNING | |
DE112019004772T5 (en) | System and method for providing road sharing support actions | |
DE102013107960B4 (en) | Method for updating a database as well as device and computer program | |
DE112019000969T5 (en) | Information processing system and information processing method | |
WO2018010875A1 (en) | Method and apparatus for creating a hazard map for identifying at least one hazardous location for a vehicle | |
DE102019117136A1 (en) | COMMUNICATION AND CONTROL FOR TRAFFIC INFRASTRUCTURE | |
DE102019116962A1 (en) | TRANSPORT INFRASTRUCTURE COMMUNICATION AND CONTROL |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20210923 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: ROBERT BOSCH GMBH |
|
17Q | First examination report despatched |
Effective date: 20231222 |