WO2020259892A1 - Infrastructure-side environment detection in autonomous driving - Google Patents
Infrastructure-side environment detection in autonomous driving
- Publication number
- WO2020259892A1 (PCT/EP2020/061065)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- environment model
- data
- sensor data
- sensors
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/06—Systems determining position data of a target
- G01S13/42—Simultaneous measurement of distance and other co-ordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/91—Radar or analogous systems specially adapted for specific applications for traffic control
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/003—Transmission of data between radar, sonar or lidar systems and remote stations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4804—Auxiliary means for detecting or identifying lidar signals or the like, e.g. laser illuminators
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4808—Evaluating distance, position or velocity data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G06V20/54—Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0108—Measuring and analyzing of parameters relative to traffic conditions based on the source of data
- G08G1/0116—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0125—Traffic data processing
- G08G1/0133—Traffic data processing for classifying traffic situation
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0137—Measuring and analyzing of parameters relative to traffic conditions for specific applications
- G08G1/0141—Measuring and analyzing of parameters relative to traffic conditions for specific applications for traffic information dissemination
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096766—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
- G08G1/096775—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a central station
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9316—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles combined with communication equipment with other vehicles or with base stations
Definitions
- the invention relates to a method for generating an environment model of an autonomously controlled vehicle.
- the invention also relates to a method for the autonomous control of a vehicle.
- the invention also relates to an environment model generation device.
- the invention also relates to a vehicle control device.
- the invention also relates to an autonomously controlled vehicle.
- the invention also relates to an automated transport system.
- Autonomous driving is understood to be the independent driving, control and parking of a vehicle without human influence.
- Autonomous driving requires precise knowledge of the position and speed both of the autonomously controlled vehicle itself and of objects located in the vicinity of its route. In the context of autonomous vehicles, the perception of the surroundings and the localization and detection of objects, as well as of the autonomous vehicle itself, are therefore of outstanding importance. If the surroundings are detected only by the autonomous vehicle itself, this detection is limited to the detection range of the vehicle's own sensors.
- the sensors used in vehicles, such as lidar, radar, ultrasonic sensors and cameras, have limitations in terms of their range, real-time capability, coverage of the surrounding area and performance, especially in difficult weather conditions such as strong sunlight and rain.
- in addition, areas shaded by trucks or cars can occur during vehicle-side environment recognition, which cannot be perceived by the vehicle's own environment detection.
- another example is majority decisions when detecting objects, i.e. an object is accepted as detected when it is detected by a majority of the available sensors but not by a minority. It would be desirable to increase the number of sensors contributing to such a decision beyond the number of sensors directly available to an autonomous vehicle. Infrastructure-side sensors could then also be consulted for such a decision: if two sensors of the vehicle detect an object and, in addition, two of two existing infrastructure-side sensors detect the object, then it would likewise be accepted that an object has been detected.
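The pooled majority decision described above can be sketched as follows. This is an illustrative reading of the idea; the function name and the sensor counts in the example are assumptions, not taken from the patent.

```python
def object_detected(vehicle_votes, infrastructure_votes):
    """Return True if a majority of all pooled sensors detect the object.

    Each argument is a list of booleans, one entry per sensor
    (True = this sensor detected the object).
    """
    votes = list(vehicle_votes) + list(infrastructure_votes)
    if not votes:
        return False
    # strict majority over the pooled set of vehicle and infrastructure sensors
    return sum(votes) * 2 > len(votes)

# Illustrative scenario: two of four vehicle sensors see the object
# (no majority on its own), but two infrastructure sensors also see it,
# so 4 of 6 pooled sensors agree.
print(object_detected([True, True, False, False], [True, True]))  # True
```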
- Some situations also require a "look around the corner", for example when driving through an X-junction or T-junction. Difficult situations, such as entering and exiting underpasses or bridges, turning maneuvers, overtaking maneuvers, etc., cannot be handled safely by vehicle-side sensors alone either.
- Another problem is that powerful environment detection requires an ever more complex sensor system and increasingly complex environment modeling algorithms. This is associated with immense costs for the individual vehicles. For example, up to 12 lidar devices are currently installed per vehicle, at a unit price of between 6,000 and 65,000 euros.
- the speed of the autonomous vehicle is greatly reduced in order to ensure the safety of the vehicle, or emergency braking occurs, as a result of which occupants in autonomous vehicles are endangered. Attempts should therefore be made to avoid such emergency braking.
- a safety driver is required, who forms a fallback level of the system.
- the object is therefore to provide a method and a device for improved environment recognition and localization of autonomous vehicles.
- This object is achieved by a method for generating an environment model of an autonomously controlled vehicle according to claim 1, a method for the autonomous control of a vehicle according to claim 10, an environment model generation device according to claim 11, a vehicle control device according to claim 12, an autonomously controlled vehicle according to claim 13 and an automated transport system according to claim 14.
- sensor data are recorded by a plurality of infrastructure-side sensors in a surrounding area of the vehicle.
- the sensor data from the different sensors are then merged, so that merged sensor data are generated on the basis of the recorded sensor data.
- environment model data are generated in a stationary manner by evaluating the merged sensor data on the infrastructure side, with objects located in the surrounding area being localized and identified. Finally, the environment model data are transmitted to the vehicle.
- the sensors on the infrastructure side work on the principle of geofencing. If an autonomously controlled vehicle enters a communication area of the infrastructure-side monitoring, communication takes place between the infrastructure and the autonomous vehicle. As part of this communication, the vehicle receives information about the permanently defined surrounding area monitored by the sensors and about objects that are present or moving therein. An environment model of the autonomous vehicle can be created or supplemented on the basis of this information. If the autonomous vehicle drives into the detection area of the infrastructure-side sensors, it is itself detected, and this information is integrated into the environment model of the monitored area. If the autonomously controlled vehicle leaves the communication area, communication between the infrastructure-side monitoring units and the relevant autonomously controlled vehicle ends. The communication area and the monitored area do not have to be identical, but they can be.
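A minimal sketch of this geofencing behavior might look as follows, assuming a circular communication area; the class and method names are illustrative only, not from the patent.

```python
import math

class CommunicationArea:
    """Tracks which vehicles are inside an infrastructure unit's
    (here assumed circular) communication area."""

    def __init__(self, center, radius_m):
        self.center = center
        self.radius_m = radius_m
        self.connected = set()

    def update(self, vehicle_id, position):
        """Track entry/exit; return 'enter', 'leave' or None."""
        inside = math.dist(position, self.center) <= self.radius_m
        if inside and vehicle_id not in self.connected:
            self.connected.add(vehicle_id)      # start communication
            return "enter"
        if not inside and vehicle_id in self.connected:
            self.connected.discard(vehicle_id)  # end communication
            return "leave"
        return None

area = CommunicationArea(center=(0.0, 0.0), radius_m=100.0)
print(area.update("F", (50.0, 0.0)))   # enter
print(area.update("F", (200.0, 0.0)))  # leave
```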
- an environment model is generated by using the method according to the invention for generating an environment model of an autonomously controlled vehicle.
- the environment model can be transmitted statically for a defined so-called traffic cell in which the autonomous vehicle is currently located.
- the current vehicle position can be used to dynamically track the transmitted environment model section.
- the environment model generating device has infrastructure-side sensors for acquiring sensor data in the area surrounding the sensors.
- Part of the environment model generation device according to the invention is also a stationary fusion unit for merging the sensor data.
- the environment model generation device according to the invention also has an evaluation unit for the stationary generation of environment model data on the basis of the merged sensor data. Objects located in the surrounding area are localized and identified.
- Part of the environment model generation device according to the invention is also a stationary communication unit for transmitting the environment model data to an autonomously controlled vehicle.
- the environment model generation device shares the advantages of the method according to the invention for generating an environment model of an autonomously controlled vehicle.
- the vehicle control device has a vehicle-side communication unit for receiving environment model data from the stationary environment model generation device. Part of the vehicle control device according to the invention is also a control unit for the automated control of the driving behavior of a vehicle on the basis of the environment model generated by the stationary environment model generation device according to the invention.
- the vehicle control device shares the advantages of the method according to the invention for autonomous control of a vehicle.
- the autonomously controlled vehicle according to the invention comprises the vehicle control device according to the invention.
- the autonomously controlled vehicle according to the invention shares the advantages of the vehicle control device according to the invention.
- the automated transport system according to the invention has an environment model generation device according to the invention arranged on the infrastructure side and at least one autonomously controlled vehicle according to the invention.
- An object localization can advantageously be supplemented by the infrastructure or at least partially outsourced to it, so that the performance of the object localization and object recognition of an autonomously controlled vehicle is improved.
- Some components of the environment model generation device according to the invention, the vehicle control device according to the invention, the autonomously controlled vehicle according to the invention and the automated transport system according to the invention can for the most part be designed in the form of software components. This applies in particular to parts of the stationary fusion unit, the evaluation unit and the control unit. In principle, however, these components can also be implemented in part, especially when particularly fast calculations are involved, in the form of software-supported hardware, for example FPGAs or the like. Likewise, the required interfaces can be designed as software interfaces, for example if it is only a matter of transferring data from other software components. However, they can also be designed as hardware-based interfaces that are controlled by suitable software.
- a largely software-based implementation has the advantage that computer systems already present in a mobile object or in the infrastructure can easily be retrofitted with a software update, possibly after the addition of extra hardware elements such as sensors and communication units, in order to work in the manner according to the invention.
- the object is also achieved by a corresponding computer program product with a computer program that can be loaded directly into a memory device of such a computer system, with program sections for carrying out the software-implementable steps of the method according to the invention when the computer program is executed in the computer system.
- such a computer program product can optionally contain additional components such as documentation and/or additional components, including hardware components, such as hardware keys (dongles etc.) for using the software.
- a computer-readable medium, for example a memory stick, a hard disk or some other transportable or permanently installed data carrier, on which the program sections of the computer program that can be read in and executed by a computer unit are stored, can be used for transport to the memory device of the computer system and/or for storage on the computer system.
- the computing unit can, for example, have one or more cooperating microprocessors or the like for this purpose.
- the sensors include radar sensors and / or lidar sensors.
- radar waves preferably have wavelengths of a few millimeters to centimeters. Radar offers complete volume illumination without blind spots, but with a comparable sensor size, a lower selectivity is achieved than with lidar. But radar is robust in almost all weather conditions, such as rain, snow, fog, darkness or direct sunlight.
- Lidar has the property of being able to display object edges more precisely than radar. However, Lidar has a very limited field of vision and is also more susceptible to adverse weather conditions.
- the two detection technologies can also be advantageously combined in order to combine the various advantages of the different technologies described with one another and to compensate for the disadvantages described.
- the sensors can also have cameras. With cameras, an extensive area can be recorded and monitored simultaneously with good resolution.
- a combination of radar and / or lidar with cameras as sensors can also be used advantageously.
- the environment model data are preferably generated object-based, with attributes being assigned to individual objects.
- the attributes include information on relevant properties of the detected objects. These properties enable a hazard assessment in connection with a detected object, and they allow the behavior of a detected object to be predicted to a certain extent.
- the named attributes of an object can include at least one of the following attribute types:
- Further attributes can include time stamps, an identification number, the length, width and height of a vehicle or object, contours, parameters close to raw data, such as the number of reflection points or a security level or reliability level of transmitted data.
- the reliability of the data can depend, for example, on whether they were estimated or otherwise determined.
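As an illustration, an object record of the object-based environment model could carry the attributes mentioned above roughly as follows; all field names, units and example values are assumptions made for this sketch.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class EnvironmentObject:
    """One detected object with the attribute types listed in the text:
    identification number, time stamp, dimensions, position, speed,
    type, and a reliability level of the transmitted data."""
    object_id: int
    timestamp_s: float
    position_m: Tuple[float, float]
    speed_mps: float
    length_m: float
    width_m: float
    height_m: float
    object_type: str          # e.g. "pedestrian", "cyclist", "vehicle"
    reliability: float = 1.0  # lower if the attributes were only estimated

# Example record for a detected vehicle with estimated (thus less
# reliable) attributes.
obj = EnvironmentObject(7, 12.5, (104.2, 8.1), 13.9,
                        4.5, 1.8, 1.5, "vehicle", reliability=0.9)
```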
- the environment model data include a grid-based occupancy map.
- a grid-based occupancy map, also referred to as an "occupancy grid map", divides the monitored area into a plurality of cells, the occupancy probability of which is determined on the basis of sensor data.
- An object can be identified on the basis of the position and number of occupied cells in the grid of the grid-based occupancy map, and a detection probability value of an object can be specified on the basis of the occupancy probability of the cells.
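A minimal sketch of reading a detection probability out of such a grid-based occupancy map, under the assumption of a simple threshold-and-average rule (the patent does not fix a specific rule, so the threshold and function names are illustrative):

```python
def detection_probability(grid, cells, threshold=0.5):
    """Mean occupancy probability over the occupied cells attributed
    to one object.

    grid  -- dict mapping (row, col) to an occupancy probability in [0, 1]
    cells -- the grid cells covered by the object
    """
    probs = [grid.get(c, 0.0) for c in cells]
    occupied = [p for p in probs if p >= threshold]
    if not occupied:
        return 0.0
    return sum(occupied) / len(occupied)

# Three cells attributed to one object; two are clearly occupied.
grid = {(0, 0): 0.9, (0, 1): 0.8, (0, 2): 0.2}
print(round(detection_probability(grid, [(0, 0), (0, 1), (0, 2)]), 2))  # 0.85
```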
- the surrounding area is divided into sub-areas with different priorities.
- Areas with different priorities can, for example, have different requirements for real-time capabilities of sensor detection, different security requirements and system requirements derived from them.
- a high priority may require the use of different sensors with different performance in different weather conditions, a high level of redundancy in the detection of objects and the processing of sensor data, high real-time requirements and a high level of accuracy and resolution in the area detection.
- a significant proportion of the processing of the sensor data is preferably carried out in each sensor itself, taking into account the specific sensor properties. No further prior knowledge is then required to generate the occupancy map. This reduces the amount of data collected by all sensors and the computing load for creating the occupancy map. However, this can be at the expense of the accuracy of the result, since information that could be helpful for a possible data fusion when creating the occupancy map can be lost during the sensor-internal processing.
- the grid-based occupancy map is preferably generated on the basis of the object-based environment model data.
- the amount of data for generating the grid-based occupancy map is then relatively small, so that the grid-based occupancy map can be created very quickly.
- the grid-based occupancy map is particularly preferably generated directly on the basis of the sensor data.
- the captured data can be processed particularly precisely, since no intermediate steps can lead to a loss of information.
- the processing takes place here on the basis of sensor data, which are composed in such a way that no information that is helpful for increasing the accuracy of the data fusion is lost due to any preprocessing.
- the amount of data is not necessarily increased in this case; however, certain sensor properties or prior knowledge about the acquisition unit of the sensor, such as the resolution, the measurement accuracy, etc., must be known and taken into account during the fusion.
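As one generic example of taking the measurement accuracy of each sensor into account during fusion, independent estimates can be combined by inverse-variance weighting. This is a standard textbook technique shown here for illustration, not necessarily the fusion method of the patent.

```python
def fuse_measurements(measurements):
    """Fuse independent (value, variance) estimates of the same quantity;
    more accurate sensors (smaller variance) receive a larger weight.

    Returns the fused value and its (smaller) fused variance.
    """
    inv_vars = [1.0 / var for _, var in measurements]
    total = sum(inv_vars)
    value = sum(v / var for v, var in measurements) / total
    return value, 1.0 / total

# A precise lidar position estimate (variance 0.01) dominates a coarse
# radar estimate (variance 0.25) of the same object position.
value, var = fuse_measurements([(10.0, 0.01), (10.4, 0.25)])
```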
- sensor data recorded on the vehicle side from the area surrounding the vehicle are also used to generate the environment model data.
- a database for the environment model of the vehicle can advantageously be expanded, so that the reliability, resolution and completeness of the environment model is improved.
- FIG. 1 shows a flow diagram which illustrates a method for generating an environment model of an autonomously controlled vehicle according to an exemplary embodiment of the invention
- FIG. 2 shows a schematic representation of an automated transport system
- FIG. 3 shows a schematic representation of a stationary monitored straight route with areas with different priorities
- FIG. 4 shows a schematic representation of a stationary monitored junction with areas with different priorities
- FIG. 5 shows a schematic representation of a layer model of determined environment data
- FIG. 6 shows a schematic representation which illustrates the creation of a digital twin of a node or an intersection
- FIG. 7 shows an environment model generating device according to an exemplary embodiment of the invention
- FIG. 8 shows a schematic representation of a vehicle control device.
- FIG. 1 shows a flow diagram 100 which illustrates a method for generating an environment model MD of an autonomously controlled vehicle F according to an exemplary embodiment of the invention.
- in step 1.I, sensor data SD are recorded by a plurality of infrastructure-side sensors in an area surrounding the vehicle F.
- the sensors used to acquire the sensor data can include, for example, lidar sensors, radar sensors and cameras.
- the sensors are arranged on sensor masts at regular intervals along a driveway or around an intersection or junction and each monitor a closer environment.
- in step 1.II, the sensor data SD from different sensors are merged, so that merged sensor data FSD are generated on the basis of the sensed sensor data.
- in step 1.III, environment model data MD of a street section, for example an intersection or a straight section, are generated in a stationary manner on the basis of the merged sensor data FSD. In the context of the environment model, objects located in the surrounding area are localized and identified.
- the environment model data MD are also transmitted to the vehicle F.
- the vehicle F enriches its own environment model MDF with the stationary environment model MD, i.e. it supplements its environment model MDF with the received environment model data MD, which were recorded and generated in a stationary manner.
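The enrichment of the vehicle's own environment model MDF with the stationary environment model data MD could be sketched as a keyed merge of object records. The merge policy (the entry with the higher reliability wins) and all names are assumptions made for this illustration.

```python
def enrich(own_model, stationary_model):
    """Merge two {object_id: attributes} maps into an enriched model."""
    merged = dict(own_model)
    for obj_id, attrs in stationary_model.items():
        if obj_id not in merged:
            merged[obj_id] = attrs   # object seen only by the infrastructure
        elif attrs["reliability"] > merged[obj_id]["reliability"]:
            merged[obj_id] = attrs   # the more reliable estimate wins
    return merged

# MDF: the vehicle's own model; MD: the stationary model received by radio.
mdf = {1: {"type": "vehicle", "reliability": 0.6}}
md = {1: {"type": "vehicle", "reliability": 0.9},
      2: {"type": "pedestrian", "reliability": 0.8}}
enriched = enrich(mdf, md)
```

The pedestrian (object 2) is outside the vehicle's own detection range and enters the model only through the infrastructure-side data.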
- FIG. 2 shows a schematic representation 20 of an arrangement made up of a vehicle-side and an infrastructure-side sensor system.
- the infrastructure 23 comprises sensor units 22a, 22b with sensors A2, with which a road section 21 is monitored on which a vehicle F is on the way.
- Typical sensors are, for example, lidar sensors, radar sensors and optical sensors, such as cameras, with which data from the environment of the infrastructure is recorded.
- the recorded data is also evaluated, checked for plausibility and exchanged.
- a defined area is monitored on the infrastructure side by the devices 22a, 22b mentioned, which is also referred to as GeoFence. This area is divided into a core area with high priority and an area with low priority. The monitoring of the different areas mentioned meets different requirements in terms of latency, detection probability, etc.
- the sensor units 22a, 22b each have communication antennas A1, with which they can communicate with other units, such as an evaluation device 22d or with the vehicle F, with the aid of a communication unit 22c on the roadside.
- the vehicle F also has a sensor unit A2 with which it can monitor its surroundings.
- the vehicle F also has an antenna A1, with which it can communicate with the sensor units 22a, 22b or the evaluation device 22d, if necessary via the infrastructure-side communication unit 22c.
- an autonomous vehicle F has an on-board unit (not shown) for communication, with which it can communicate via the antenna A1 with an infrastructure-side communication unit 22c.
- the infrastructure-side communication unit 22c transmits its information periodically by radio. For example, the IEEE 802.11 standard or car-to-X can be used for transmission.
- the vehicle F receives information relating to an environment model of the infrastructure.
- the information on the surroundings can be transmitted, for example, to all road users in the catchment area of the infrastructure-side communication unit 22c or the traffic cell assigned to it. If an environment model is to be dynamically adapted for individual vehicles, identification can take place via bidirectional communication.
- FIG. 3 shows an example of a straight stretch 30 which is divided into a core area 31a with high priority and an outer area 31b with low priority.
- the core area 31a is formed by the driving area of the route 30 for vehicles F and has particularly high safety requirements. These are met by an increased number of different sensors 22a, 22b that monitor this core area.
- the different sensors complement each other through their properties in terms of detection accuracy and availability in different weather conditions and have redundancy in the acquisition and processing of the monitoring data.
- the real-time requirements as well as the accuracy and resolution of the environment detection in the core area are significantly higher than in the periphery.
- the required sensors and evaluation devices are selected and arranged accordingly.
- a junction 40 in the road network is also shown. This likewise has a core area 41a with high priority and a peripheral area 41b with lower priority. The junction 40 is monitored with the aid of three sensor units 22a, 22b.
- monitoring data are transmitted from a communication unit 22c by radio to vehicles F in the transmission range.
- FIG. 5 shows a layer model 50 for an environment model with five layers.
- Four-layer models are already known as “local dynamic map”.
- the conventional four-layer model is supplemented by an additional layer S5, so that a compact representation of a surrounding situation can be transferred to a vehicle.
- the lowest layer S1 comprises so-called permanent static data, such as map data.
- the second lowest layer S2 comprises transient static data, such as the roadside infrastructure. This includes street signs, landmarks, etc.
- the third layer S3 comprises the transient dynamic data. These include, for example, traffic data and data about the signal phase or data about the current weather-related road conditions.
- the fourth layer S4 comprises highly dynamic data, such as vehicles and pedestrians.
- layer S4 contains the traffic participants detected by the sensors, such as pedestrians P, cyclists and vehicles F, together with the associated estimated attributes, such as their position, length, speed, detection probability and type.
- the determination of the type, i.e. the distinction between a pedestrian, a cyclist and a vehicle, is carried out by applying a classification algorithm to the recorded sensor data.
- the fifth layer S5 is formed by a grid-like occupancy map.
- the current positions of objects, such as vehicles, are shown as a grid R.
- the data present in layer S4 are converted into the occupancy grid format.
- the raw data from the sensors can also be used directly.
- the monitored area is divided into cells and the probability that the cells are occupied is determined by the infrastructure-side evaluation units and, if necessary, also by the vehicle-side evaluation units. The quality of the determination of the occupation probability and the attributes mentioned is increased by sensor data fusion.
- scene interpretations, such as a suitable behavior model for the autonomous vehicle F, can also be derived from the environment model data.
- the infrastructural monitoring enables a better assessment of the dynamics in the area under consideration. For this purpose, information about the traffic flow of certain streets and intersections can be collected.
- Layer S1 is static, whereas layers S4 and S5 concern highly dynamic processes.
- the information recorded on the infrastructure side relates in particular to layers S4 and S5. Depending on the dynamics and criticality of the available data, these are passed on to the vehicle at a correspondingly high update rate. For highly dynamic data, this is typically at least 15 updates per second.
- Layer S1 data is static and only needs to be transmitted to the vehicle once a day.
- the data of the different layers basically have different update frequencies and therefore also have different requirements with regard to real-time data acquisition and data transmission.
- Layer S5 is used, in particular, for free space modeling with a very small amount of data and thus a high possible update rate and low latency. A vehicle can be informed in real time about a drivable area using data from layer S5.
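The layer-dependent update rates can be sketched as a simple transmission schedule. The intervals for S1 (once per day) and for S4/S5 (at least 15 updates per second) follow the text; the S2 and S3 values are assumptions for illustration:

```python
# Update intervals per layer in seconds. S1 and S4/S5 follow the
# source text; S2 and S3 are assumed values for illustration.
UPDATE_INTERVAL_S = {
    "S1": 24 * 3600,  # permanent static data: once per day
    "S2": 3600,       # transient static data (assumed)
    "S3": 60,         # transient dynamic data (assumed)
    "S4": 1 / 15,     # highly dynamic data: at least 15 updates/s
    "S5": 1 / 15,     # occupancy grid: high rate, low latency
}

def layers_due(elapsed_since_last: dict) -> list:
    """Return the layers whose data should be retransmitted to the
    vehicle, given the seconds elapsed since each layer's last update."""
    return [layer for layer, dt in elapsed_since_last.items()
            if dt >= UPDATE_INTERVAL_S[layer]]
```

This captures the point of the surrounding bullets: the different layers have fundamentally different update frequencies, and hence different real-time requirements for acquisition and transmission.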
- FIG. 6 shows a schematic illustration 60 which illustrates the creation of a digital twin of a node or an intersection.
- the detected objects F, M, P are shown as cubic objects.
- This data can then be passed on to an autonomous vehicle for enrichment and expansion.
- the transmission process can take place through a communication medium such as a roadside unit 22c (see also FIG. 2) or with the aid of 4G/5G data transmission.
- the flow of information can be unidirectional from infrastructure 22a, 22b, 22c, 22d to vehicle F or, vice versa, unidirectional from vehicle F to infrastructure 22a, 22b, 22c, 22d.
- the flow of information can, however, also be bidirectional.
- FIG. 7 shows a schematic representation of a surrounding model generating device 70.
- the environment model generation device 70 comprises infrastructure-side sensors 22a, 22b for acquiring sensor data SD in the vicinity of the sensors.
- the recorded sensor data are transmitted to an evaluation device 22d via a communication unit 22c.
- the evaluation device 22d comprises a stationary fusion unit 71 for merging the sensor data SD and an evaluation unit 72 for the stationary generation of environment model data MD on the basis of the merged sensor data SD.
- the evaluation device 22d also includes a stationary communication unit 73 for transmitting the environment model data MD, possibly also via the communication unit 22c, to an autonomously controlled vehicle F.
- FIG. 8 shows a vehicle control device 80 which has a vehicle-side communication unit 81 for receiving environment model data MD from a stationary environment model generation device 70. Part of the vehicle control device 80 is also a control unit 82 for the automated control of the driving behavior of a vehicle F on the basis of an environment model MD generated by the stationary environment model generation device 70.
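The processing chain of the environment model generation device 70, with fusion unit 71 merging the sensor data SD and evaluation unit 72 producing environment model data MD, can be sketched as follows. The concatenation-based fusion and the grouping of localized objects by type are simplifying assumptions for illustration:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    x: float          # position in the monitored area
    y: float
    obj_type: str     # e.g. "pedestrian", "cyclist", "vehicle"

def fuse(sensor_frames: List[List[Detection]]) -> List[Detection]:
    """Stand-in for the stationary fusion unit 71: here simply the
    concatenation of all sensors' detections; a real fusion unit
    would associate and merge duplicate tracks of the same object."""
    return [d for frame in sensor_frames for d in frame]

def generate_environment_model(detections: List[Detection]) -> dict:
    """Stand-in for the evaluation unit 72: group the localized and
    identified objects by type into environment model data MD."""
    model: dict = {}
    for d in detections:
        model.setdefault(d.obj_type, []).append((d.x, d.y))
    return model
```

The resulting model data would then be handed to the stationary communication unit 73 for transmission to the vehicle-side communication unit 81 of the vehicle control device 80.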
Abstract
The invention relates to a method for generating an environment model of an autonomously controlled vehicle F. In the method, sensor data are acquired by a plurality of infrastructure-side sensors (22a, 22b) in a surrounding area of the vehicle F. Fused sensor data are generated in a stationary manner on the basis of the acquired sensor data. Environment model data are further generated in a stationary manner by infrastructure-side evaluation of the sensor data, whereby objects located in the surrounding area can be localized and identified. Finally, the environment model data are communicated to the vehicle F. The invention further relates to a method for the autonomous control of a vehicle F. The invention also relates to an environment model generation device and a vehicle control device. The invention additionally relates to an autonomously controlled vehicle F and an automated transport system. An entry junction (40) in the road network has a central zone (41a) with high priority and a peripheral zone (41b) with lower priority. The entry junction (40) is monitored by means of sensor units (22a, 22b). Two vehicles F, each approaching the entry junction, are located in the area of the junction and thus also in the monitoring zone (41a, 41b). In addition, a motorcycle M moving away from the inner monitoring zone (41a) is detected. Furthermore, a person P crossing a pedestrian crossing Z is detected in the inner monitoring zone (41a). The monitoring data are transmitted by radio via a communication unit (22c) to the vehicles F located in the transmission area.
Sensor data of the vehicle's surrounding area acquired on the vehicle side are preferably also used to generate the environment model data. The database for the vehicle's environment model can advantageously be extended, which improves the reliability, resolution and completeness of the environment model.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20725093.7A EP3928127A1 (fr) | 2019-06-25 | 2020-04-21 | Détection de l'environnement côté infrastructure lors de la conduite autonome |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102019209154.7A DE102019209154A1 (de) | 2019-06-25 | 2019-06-25 | Infrastrukturseitige Umfelderfassung beim autonomen Fahren |
DE102019209154.7 | 2019-06-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020259892A1 true WO2020259892A1 (fr) | 2020-12-30 |
Family
ID=70680458
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2020/061065 WO2020259892A1 (fr) | 2019-06-25 | 2020-04-21 | Détection de l'environnement côté infrastructure lors de la conduite autonome |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3928127A1 (fr) |
DE (1) | DE102019209154A1 (fr) |
WO (1) | WO2020259892A1 (fr) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102022211436A1 (de) | 2022-10-28 | 2024-05-08 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zur Erstellung eines Fahrzeugmodells für ein Kraftfahrzeug, insbesondere für einen LKW |
DE102023108153A1 (de) | 2023-03-30 | 2024-10-02 | Daimler Truck AG | Verfahren zur Lokalisierung einer Schallquelle |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102020211478A1 (de) | 2020-09-14 | 2022-03-17 | Robert Bosch Gesellschaft mit beschränkter Haftung | Konzept zum Unterstützen eines zumindest teilautomatisiert geführten Kraftfahrzeugs |
DE102020213661A1 (de) | 2020-10-30 | 2022-05-05 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zum Analysieren eines Umfelds eines Kraftfahrzeugs |
DE102021203994A1 (de) | 2021-04-21 | 2022-10-27 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zum Betreiben eines wenigstens teilweise automatisierten Fahrzeugs |
DE102021206319A1 (de) | 2021-06-21 | 2022-12-22 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zum infrastrukturgestützten Assistieren eines Kraftfahrzeugs |
DE102021207997A1 (de) | 2021-07-26 | 2023-01-26 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zum Überprüfen einer Vollständigkeit eines Modells einer Verkehrsdynamik an einem Verkehrsknotenpunkt |
DE102021208616A1 (de) | 2021-08-06 | 2023-02-09 | Siemens Mobility GmbH | Anordnung von infrastrukturseitiger Überwachungssensorik |
DE102021209699A1 (de) | 2021-09-03 | 2023-03-09 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zum Betreiben von mehreren Infrastruktursystemen |
DE102021209680A1 (de) | 2021-09-03 | 2023-03-09 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zur Verbesserung der Schätzung von Existenzwahrscheinlichkeiten |
DE102021213819A1 (de) | 2021-12-06 | 2023-06-07 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zum Bestimmen einer Datenübertragungsqualität einer Kommunikationsverbindung zwischen vernetzten Kraftfahrzeugen und einem Infrastruktursystem zur Fahrunterstützung von zumindest teilautomatisiert geführten vernetzten Kraftfahrzeugen |
DE102021213818A1 (de) | 2021-12-06 | 2023-06-07 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zum Bestimmen einer Datenübertragungsqualität eines Infrastruktursystems zur Fahrunterstützung von zumindest teilautomatisiert geführten vernetzten Kraftfahrzeugen |
DE102022201280A1 (de) | 2022-02-08 | 2023-08-10 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren und Vorrichtung zum Betreiben eines Infrastruktursensorsystems |
DE102022203289A1 (de) | 2022-04-01 | 2023-10-05 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren und Vorrichtung zum Erkennen von Fehlausrichtungen eines stationären Sensors sowie stationärer Sensor |
DE102022206981A1 (de) | 2022-07-08 | 2024-01-11 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zur Fahrunterstützung eines vernetzten Kraftfahrzeugs |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130307981A1 (en) * | 2012-05-15 | 2013-11-21 | Electronics And Telecommunications Research Institute | Apparatus and method for processing data of heterogeneous sensors in integrated manner to classify objects on road and detect locations of objects |
WO2019015927A1 (fr) * | 2017-07-21 | 2019-01-24 | Robert Bosch Gmbh | Dispositif et procédé pour produire une information d'état d'un système de voiturier automatique |
US20190096238A1 (en) * | 2017-06-20 | 2019-03-28 | Cavh Llc | Intelligent road infrastructure system (iris): systems and methods |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102016205139B4 (de) * | 2015-09-29 | 2022-10-27 | Volkswagen Aktiengesellschaft | Vorrichtung und Verfahren zur Charakterisierung von Objekten |
DE102016214470B4 (de) * | 2016-08-04 | 2023-06-22 | Volkswagen Aktiengesellschaft | Verfahren und System zum Erfassen eines Verkehrsumfeldes einer mobilen Einheit |
- 2019
- 2019-06-25 DE DE102019209154.7A patent/DE102019209154A1/de not_active Withdrawn
- 2020
- 2020-04-21 EP EP20725093.7A patent/EP3928127A1/fr active Pending
- 2020-04-21 WO PCT/EP2020/061065 patent/WO2020259892A1/fr unknown
Also Published As
Publication number | Publication date |
---|---|
DE102019209154A1 (de) | 2020-12-31 |
EP3928127A1 (fr) | 2021-12-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020259892A1 (fr) | Détection de l'environnement côté infrastructure lors de la conduite autonome | |
EP3688742B1 (fr) | Système pour produire et/ou actualiser un modèle numérique d'une carte numérique | |
EP3572293B1 (fr) | Procédé d'aide à la conduite d'au moins un véhicule automobile et système d'assistance | |
EP3363005B1 (fr) | Procédé de détermination et de mise à disposition d'une banque de données liée à un environnement prédéfini, contenant des données d'environnement | |
DE102013107959B4 (de) | Verfahren zur Unterstützung des Parkens von Fahrzeugen auf einer Parkfläche, Parkflächenmanagementsystem, fahrzeugseitiges System und Computerprogramm dafür | |
DE102020111682A1 (de) | Systeme und verfahren zum implementieren einer autonomen fahrzeugreaktion auf ein sensorversagen | |
DE112020004587T5 (de) | Verteilter verkehrssicherheitskonsens | |
DE102017217443B4 (de) | Verfahren und System zur Bereitstellung von Trainingsdaten zum maschinellen Lernen für ein Steuerungsmodell einer automatischen Fahrzeugsteuerung | |
DE102018129048A1 (de) | Systeme und verfahren zum bestimmen von sicherheitsereignissen für ein autonomes fahrzeug | |
DE112017003968T5 (de) | Steuerung eines autonomen fahrzeugs | |
DE102015209947A1 (de) | Evakuierungsfahrtassistenzvorrichtung | |
DE102014220681A1 (de) | Verkehrssignalvorhersage | |
DE102014220678A1 (de) | Fahrzeug-fernüberwachung | |
DE102016212700A1 (de) | Verfahren und System zur Steuerung eines Fahrzeugs | |
EP2936470B1 (fr) | Procédé et système pour apprendre d'événements de trafic ainsi que l'utilisation dudit système | |
EP2858039A1 (fr) | Procédé de contrôle automatique de l'arrivée d'un véhicule routier dans une section de route contrôlée, système de contrôle et son système côté véhicule et programme informatique | |
WO2017102623A1 (fr) | Procédé et dispositif de prédiction du déplacement d'un usager de la route dans un zone de trafic | |
DE102019209552A1 (de) | Verfahren zur Verkehrserfassung | |
DE102018124578A1 (de) | System und verfahren zur dynamischen fahrzeuganpassung und zum -tuning | |
DE112019004772T5 (de) | System und Verfahren zum Bereitstellen von unterstützenden Aktionen zur gemeinsamen Straßenbenutzung | |
DE102013107960B4 (de) | Verfahren zur Aktualisierung einer Datenbasis sowie Einrichtung und Computerprogramm | |
DE112019000969T5 (de) | Informationsverarbeitungssystem und Informationsverarbeitungsverfahren | |
WO2018010875A1 (fr) | Procédé et dispositif de création d'une carte de danger pour identifier au moins un emplacement dangereux pour véhicule | |
DE102019117136A1 (de) | Kommunikation und steuerung für verkehrsinfrastruktur | |
DE102019116962A1 (de) | Transportinfrastrukturkommunikation und -steuerung |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20725093 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2020725093 Country of ref document: EP Effective date: 20210923 |
NENP | Non-entry into the national phase |
Ref country code: DE |