CN115938142A - Method, device and infrastructure system for driving assistance for a vehicle in an infrastructure - Google Patents

Method, device and infrastructure system for driving assistance for a vehicle in an infrastructure

Info

Publication number
CN115938142A
Authority
CN
China
Prior art keywords
information
vehicle
occlusion
infrastructure
environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211040990.2A
Other languages
Chinese (zh)
Inventor
M·鲍姆格特纳
A·格拉尔迪
P·鲍曼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of CN115938142A


Classifications

    • G08G1/0965 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages, responding to signals from another vehicle, e.g. emergency vehicle
    • G06V20/54 Surveillance or monitoring of activities, e.g. for recognising suspicious objects, of traffic, e.g. cars on the road, trains or boats
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems, related to ambient conditions
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G08G1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/16 Anti-collision systems
    • G08G1/161 Anti-collision systems: decentralised systems, e.g. inter-vehicle communication
    • G08G1/163 Decentralised anti-collision systems involving continuous checking

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Traffic Control Systems (AREA)

Abstract

According to a first aspect of the invention, a method for driving assistance in an infrastructure is proposed for an at least partially automatically guided vehicle. First environmental information about a region to be monitored, in particular a statically defined region, is detected by means of an environment sensing device of the infrastructure. According to the invention, occlusion information is derived from the first environmental information, the occlusion information relating to regions that are currently not detectable by the environment sensing device of the infrastructure due to, in particular, temporary dynamic occlusion. The occlusion information and/or the first environmental information is made available to the vehicle.

Description

Method, device and infrastructure system for driving assistance for a vehicle in an infrastructure
Technical Field
The invention relates to a method, a device and an infrastructure system for driving assistance, via an infrastructure, for an at least partially automatically guided vehicle.
Background
A driving assistance system for displaying augmented reality is known from publication DE 10 2014 208 A1; it comprises a display unit, a position-determining unit, a receiving unit and an evaluation unit. The receiving unit is designed to receive a first image signal containing an environmental region that is obscured by an environmental object from the driver's point of view, and the position-determining unit is designed to determine the head pose of the driver and/or the pose of the display unit. The evaluation unit is designed to compute a second image signal corresponding to a virtual view of the surrounding area from the driver's perspective. The display unit, in turn, is designed to generate a partially transparent representation of the second image signal corresponding to this virtual view.
The subject matter of publication DE 10 2012 022 717 A1 is the assistance of a motor vehicle user by means of a virtually elevated seat position. To make it easier for the driver to perceive the surroundings of the vehicle, a map of the surroundings of the motor vehicle is determined. A virtual position is predefined that differs from the actual position the user assumes in the seat of the motor vehicle, and the surroundings are rendered in the map representation from this virtual position.
As in other technical fields, networking plays an increasingly important role in vehicle applications. More and more vehicles are able to connect with other traffic participants, with infrastructure components (e.g. so-called road side units, "RSUs") or with backend services in the cloud.
Recently, the networking of vehicles with infrastructure systems in particular has become increasingly important. An infrastructure-side system can assist an automatically guided vehicle in its driving task, for example by providing, via infrastructure sensors or data servers at the roadside, additional information that cannot be generated, or can be generated only to a limited extent, by the on-board sensor system itself. This additional information relates in particular to objects in the vehicle's environment. Such assistance can extend as far as direct intervention in the driving function, for example when an autonomous or highly automated vehicle can no longer drive independently, or only within limits, on account of its restricted environmental field of view. Networking with the infrastructure side via so-called V2X communication (vehicle-to-everything, i.e. vehicle-to-vehicle or vehicle-to-infrastructure) can improve not only traffic safety but also traffic efficiency.
One difficulty in using infrastructure-based environmental sensor systems is that some areas of the environment to be monitored are obscured, i.e. not visible to the infrastructure sensors, so no environmental information can be generated for these areas. For example, a region of the traffic lane may be occluded from the field of view of one or more environment sensors of the infrastructure by moving vehicles, with the occlusions changing dynamically, i.e. with the movement of the vehicles causing them. The infrastructure system cannot in every case evaluate whether the missing information from an occluded area is critical for a receiving vehicle: on the one hand, the infrastructure does not, in every system design, know the receiving vehicles or their positions and driving data; on the other hand, the criticality of an occlusion can only be judged by the receiving vehicle itself, since only it knows its planned trajectory and its own sensor data.
Furthermore, partially or fully automated vehicles usually have an on-board environment sensor system for carrying out their driving tasks. Depending on the situation, this environment sensor system may be available only to a limited extent, owing to the driving surroundings, speed, weather and so on, but it can be used in addition to data from the infrastructure.
Disclosure of Invention
In environments where the vehicle's sensors can be used only to a limited extent or are insufficient for reliably completing the driving task (for example, a section of highway that, for structural reasons, cannot be fully surveyed from the vehicle's perspective), a geographically static infrastructure system is intended to provide the information needed by at least partially automatically guided vehicles that depend on assistance in order to complete their driving tasks reliably. The vehicle moves dynamically within an area covered by the infrastructure's environment sensing devices, such as a segment of a highway. From the perspective of the infrastructure system, the motion of vehicles can cause dynamic occlusions within the monitored area.
It is not practical to install infrastructure sensing devices that are completely and permanently free of occlusion.
The present invention therefore addresses the following: how infrastructure-side occlusion can be communicated to assisted, at least partially automated vehicles, in particular highly or fully automated vehicles, and how such incomplete dynamic scene data can be handled in order to ensure safe driving.
Occlusion information is understood here to mean, in particular, information identifying those regions of the vehicle environment that are relevant and necessary but currently cannot be detected by the infrastructure's environmental sensing devices due to occlusion. The occlusion information may, for example, comprise the size or dimensions and/or the position of the occluded region. It may additionally comprise dynamic quantities, i.e. quantities describing the temporal progression of the occluded region.
Dynamic occlusion information is understood as occlusion information that contains a temporal component, i.e. the temporal progression of the occlusion.
According to a first aspect of the invention, a method for driving assistance via an infrastructure is proposed for an at least partially automatically guided vehicle. First environmental information about a region to be monitored, in particular a statically defined region, is detected by means of an environment sensing device of the infrastructure. According to the invention, occlusion information is derived from the first environmental information, the occlusion information relating to regions that currently cannot be detected by the environment sensing device of the infrastructure due to, in particular, temporary dynamic occlusion. The occlusion information and/or the first environmental information is made available to the vehicle.
According to a second aspect of the invention, a device for driving assistance in an infrastructure for an at least partially automatically guided vehicle is proposed, the device being designed to carry out the method according to the first aspect of the invention and comprising for this purpose:
- a receiving unit for receiving first environmental information about an area to be monitored, detected by means of an environment sensing device of the infrastructure;
- a computing unit configured to derive occlusion information from the first environmental information, the occlusion information relating to a region that currently cannot be detected by the environment sensing device of the infrastructure due to occlusion;
- a control unit configured to guide the vehicle at least partially automatically as a function of the occlusion information.
According to a third aspect of the invention, an infrastructure system for an infrastructure is proposed, designed for driving assistance for an at least partially automatically guided vehicle. The infrastructure system comprises a transmitting unit and an environmental sensing device having at least one environmental sensor. The environmental sensing device is designed to detect first environmental information about a region to be monitored by the infrastructure, in which an at least partially automatically guided vehicle moves. According to the invention, occlusion information can be derived from the first environmental information, the occlusion information relating to regions of the monitored area that are currently not detectable by the environmental sensing device of the infrastructure due to occlusion. The occlusion information and/or the first environmental information is transmitted to the vehicle via the transmitting unit.
The invention is therefore based on the idea of making current dynamic occlusion information available to at least partially automatically guided vehicles currently moving within an area monitored by an infrastructure. Such occlusion information can be determined from environmental information obtained by the environmental sensors of the infrastructure or of the infrastructure system. An at least partially automatically guided vehicle can then use this occlusion information to adapt its at least partially automated driving functions and thus make reliable use of the first environmental information. For example, a vehicle may travel more slowly as it approaches an occluded area, i.e. an area not currently visible to the infrastructure's environmental sensing devices because it is obscured, for example, by another vehicle. Likewise, the trajectory of the at least partially automatically guided vehicle can be planned such that it does not pass through such an occluded region.
By incorporating such scene data, including occlusion information, generated by the infrastructure's environment sensing devices according to the invention, the Operational Design Domain (ODD) of the automatically guided vehicle can be expanded. Without this information, a failure or inadequacy of the infrastructure system could lead to a degradation of the vehicle's functionality.
According to a preferred embodiment of the invention, the first environmental information comprises at least first information about the static field of view of the infrastructure's environment sensing device, second information about objects within that field of view, and third information about the free space within that field of view. Occlusion information can be generated from the first, second and third information.
This means that, in order to ensure that an at least partially automatically guided vehicle is provided with all the information it needs to drive safely within the infrastructure, i.e. to evaluate the situation correctly from the "driver perspective", the infrastructure system can transmit, in addition to the objects in the current scene or driving situation, the entire static field of view and all visible free space. From this information, occlusion information can be derived, for example by relating the free space and the objects to the entire field of view, so that invisible occluded regions can be recognized. With this information, the vehicle can trust the data sent by the infrastructure and travel safely, at least partially automatically, in particular autonomously, in the area.
For example, second information about objects within the field of view of the environment sensing device and third information about free space within that field of view can be transmitted to the vehicle. The first information about the static field of view of the infrastructure's environment sensing device is already known to the vehicle. The occluded areas are then transmitted only implicitly: regions of the known field of view that are associated neither with an object nor with free space are identified as occluded regions.
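The implicit variant just described can be sketched as a set difference over a discretized field of view: any cell of the known static field of view that is reported neither as object nor as free space is treated as occluded. The following minimal Python sketch is purely illustrative; the grid discretization and all names are assumptions, not part of the patent.

```python
# Implicit occlusion derivation: cells of the static field of view that are
# neither covered by a reported object nor reported as free space are
# treated as occluded (illustrative discretization, not from the patent).

def derive_occlusion(field_of_view, object_cells, free_cells):
    """Return the set of cells that are currently not observable."""
    return set(field_of_view) - set(object_cells) - set(free_cells)

# Toy 1D lane strip of 10 cells: a vehicle occupies cells 3-4 and casts a
# shadow over cells 5-6; the sensor reports free space everywhere else.
fov = range(10)
objects = {3, 4}
free = {0, 1, 2, 7, 8, 9}
occluded = derive_occlusion(fov, objects, free)  # -> {5, 6}
```

The vehicle can perform this subtraction itself, since the static field of view is known to it and only objects and free space need to be transmitted.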
Alternatively, the reliably detectable area can be determined dynamically by a computing unit of the infrastructure from the first information about the static field of view, the second information about objects in the field of view and the third information about free space in the field of view, and the resulting occlusion information is then provided to the vehicle. In this case, the identified occluded regions have already been removed from the field of view of the environment sensing device in the infrastructure, and the remaining area is transmitted as a "dynamic safety region".
Alternatively or additionally, the occlusion information can be determined directly by the infrastructure's environment sensors in the form of occluded regions and made available to the vehicle. A decision can then be made in the vehicle, based on information from the infrastructure system (static field of view, objects and free space) and on the vehicle's own data, as to how the reported invisible areas (whether transmitted implicitly or explicitly) should be handled.
According to a further embodiment of the invention, second environmental information about the current environment of the vehicle can preferably be detected by means of an environmental sensing device of the vehicle, and occlusion information can additionally be determined from this second environmental information. The vehicle's environment sensors usually have a different field of view from the infrastructure's environment sensors, so occlusion information can be determined, for example, by comparison.
In a preferred embodiment, the reliably detectable area can be determined dynamically by a computing unit of the infrastructure from the first information about the static field of view of the infrastructure's environment sensors, the second information about objects in the field of view and the third information about free space. This has the advantage that the vehicle can directly use the information about which areas are reliably detected at which time, without having to analyse and process the provided environmental information and/or occlusion information itself.
The environmental information and/or the occlusion information can preferably be made available to the vehicle as a list and/or in the form of a grid whose cells are associated with attributes representing this environmental and/or occlusion information.
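A grid representation of this kind could, for instance, associate each cell with a state attribute. The cell states, the encoding and the function below are purely illustrative assumptions; the patent does not specify a concrete format.

```python
from enum import IntEnum

# Hypothetical cell states for a grid-based transmission of environment and
# occlusion information (illustrative only; not specified by the patent).
class CellState(IntEnum):
    FREE = 0      # visible to the infrastructure sensors, no object
    OBJECT = 1    # visible, an object was detected here
    OCCLUDED = 2  # inside the static field of view but currently not visible

def to_grid(width, height, objects, occluded):
    """Build a row-major grid; cells default to FREE within the field of view."""
    grid = [[CellState.FREE] * width for _ in range(height)]
    for x, y in objects:
        grid[y][x] = CellState.OBJECT
    for x, y in occluded:
        grid[y][x] = CellState.OCCLUDED
    return grid

# A 4x2 excerpt: one object at (1, 0) shadowing cells (2, 0) and (3, 0).
grid = to_grid(4, 2, objects={(1, 0)}, occluded={(2, 0), (3, 0)})
```

Such a grid can be serialized compactly for transmission, while an object list carries per-object attributes such as position and speed.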
For example, the first environmental information and/or occlusion information can be transmitted from the infrastructure system to the vehicle by means of a V2X message, the V2X message comprising a list of objects and information about the free space, each with an associated confidence range. In this sense, the V2X message may, for example, be a CPM (Collective Perception Message), in which a vehicle or infrastructure system transmits an object list containing all objects and vehicles perceived in its environment. Other formats and standards for V2X messages that may be used within the scope of the invention are well known to the skilled person.
The transmitted first environmental information and/or occlusion information can preferably be combined with second environmental information generated by the vehicle's environment sensors to form a common data set, from which the remaining occluded regions, in particular dynamic occluded regions, in the vehicle's environment can be determined.
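Combining the two sources can be sketched as another set operation: a region that the infrastructure reports as occluded remains critical for the vehicle only if the vehicle's own sensors do not cover it either. The cell sets and names below are illustrative assumptions, not part of the patent.

```python
# Fusing infrastructure and on-board views: the remaining occlusion is the
# part of the infrastructure-side occlusion that the vehicle's own sensors
# do not see either (illustrative cell-set model).

def remaining_occlusion(infra_occluded, vehicle_visible):
    """Cells occluded for the infrastructure AND not covered by the vehicle."""
    return set(infra_occluded) - set(vehicle_visible)

infra_occluded = {(5, 0), (6, 0), (6, 1)}
vehicle_visible = {(5, 0), (5, 1)}  # the vehicle's field of view overlaps partly
residual = remaining_occlusion(infra_occluded, vehicle_visible)
```

Only the residual regions then need to be considered in trajectory planning or speed adaptation.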
Preferably, the current speed and/or acceleration of the vehicle can then be adapted as a function of the occlusion information, the adaptation taking place in particular as a function of the vehicle's distance from the occluded region represented by the occlusion information.
The behaviour of the vehicle can therefore preferably be controlled as a function of the occluded regions, the objects and the free space in the vehicle's environment.
In one embodiment of the invention, at least the transmitting unit is part of an RSU. This offers the technical advantage that the method can be carried out efficiently. The RSU may comprise further components, for example a computing unit for determining the occlusion information.
The abbreviation "RSU" stands for "road side unit". In German, the term may be rendered as "straßenseitige Einheit" or "straßenseitige Infrastruktureinheit". The following terms may also be used synonymously for "RSU": roadside unit, roadside infrastructure unit, communication module, roadside radio unit, roadside transmitting station.
The expression "at least partially automatically guided" covers one or more of the following cases: assisted, partially automated, highly automated and fully automated guidance of a motor vehicle.
Assisted guidance means that the driver of the motor vehicle continuously performs either the lateral or the longitudinal guidance of the vehicle, while the respective other driving task (i.e. control of the longitudinal or lateral guidance) is carried out automatically. In assisted guidance, therefore, either the lateral or the longitudinal guidance is controlled automatically.
Partially automated guidance means that the longitudinal and lateral guidance of the motor vehicle are controlled automatically in certain situations (for example driving on a motorway, driving in a parking lot, overtaking an object, driving in a lane defined by lane markings) and/or for a certain period of time. The driver does not have to control the longitudinal and lateral guidance manually, but must constantly monitor their automated control in order to be able to intervene manually if necessary, and must be ready to take over vehicle guidance completely at any time.
Highly automated guidance means that the longitudinal and lateral guidance of the motor vehicle are controlled automatically for a certain period of time in certain situations (for example driving on a motorway, driving in a parking lot, overtaking an object, driving in a lane defined by lane markings). The driver does not have to control the longitudinal and lateral guidance manually, nor constantly monitor their automated control in order to be able to intervene. If necessary, a take-over request is automatically issued to the driver, in particular with a sufficient time margin, to take over control of the longitudinal and lateral guidance; the driver must therefore potentially be able to take over this control. The limits of the automated control of the lateral and longitudinal guidance are recognized automatically. In highly automated guidance, it is not possible to reach a minimal-risk state automatically in every initial situation.
Fully automated guidance means that the longitudinal and lateral guidance of the motor vehicle are controlled automatically in certain situations (for example driving on a motorway, driving in a parking lot, overtaking an object, driving in a lane defined by lane markings). The driver does not have to control the longitudinal and lateral guidance manually and does not have to monitor their automated control in order to intervene. Before the automated control of the lateral and longitudinal guidance ends, the driver is automatically requested, in particular with a sufficient time margin, to take over the driving task (control of the lateral and longitudinal guidance of the motor vehicle). If the driver does not take over the driving task, the vehicle automatically returns to a minimal-risk state. The limits of the automated control of the lateral and longitudinal guidance are recognized automatically, and a return to a minimal-risk system state is possible automatically in all situations.
Driverless operation means that the longitudinal and lateral guidance of the vehicle are controlled automatically regardless of the particular application (for example driving on a highway, driving in a parking lot, overtaking a vehicle, driving in a lane defined by lane markings). The occupants of the motor vehicle do not have to control the longitudinal and lateral guidance manually and do not have to monitor their automated control. The longitudinal and lateral guidance can thus be controlled automatically on all road types, in all speed ranges and under all environmental conditions, for example. The entire driving task of the driver is taken over automatically; a driver is no longer required, and the vehicle can drive from any starting position to any destination even without a driver or passengers. Problems that arise are solved automatically, i.e. without the help of a driver or passengers.
Remote control of the motor vehicle means that the lateral and longitudinal guidance of the vehicle are controlled remotely, for example by sending remote-control signals to the vehicle. The remote control is performed, for example, by means of a remote-control device.
Device features likewise follow from corresponding method features, and vice versa: the technical functionality of the method follows from the corresponding technical functionality of the device, and vice versa. The same applies to system features, which likewise follow from method features and/or device features, and vice versa.
According to one specific embodiment, the environment sensor system of the infrastructure comprises one or more environment sensors which are arranged spatially distributed, in particular statically, within the infrastructure.
The environmental sensor is, for example, one of the following environmental sensors: radar sensors, ultrasonic sensors, video sensors, infrared sensors or magnetic field sensors.
The information detected by the environment sensors of the infrastructure includes, for example, sensor data of a single environment sensor and/or data generated by sensor data fusion of data of at least two environment sensors.
The information detected or generated by the environment sensor of the infrastructure is generated, for example, in the form of an object list, wherein, for example, a position and/or a speed is determined for each detected object and entered into the object list.
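Such an object list might, for example, be represented as follows; all field names, units and values are hypothetical and serve only to illustrate the kind of per-object entries described above.

```python
from dataclasses import dataclass

# Illustrative object-list entry as produced by the infrastructure's sensor
# fusion; the field names are assumptions, not taken from the patent.
@dataclass
class DetectedObject:
    object_id: int
    position: tuple   # (x, y) in a map-fixed frame, in metres (assumed)
    speed: float      # in m/s (assumed)
    heading: float    # in rad (assumed)

# A two-entry object list for two detected vehicles.
object_list = [
    DetectedObject(1, (120.0, 3.5), 27.0, 0.0),
    DetectedObject(2, (95.0, 7.0), 24.5, 0.0),
]
```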
Drawings
Embodiments of the present invention are described in detail with reference to the accompanying drawings.
Fig. 1 shows a schematic view of a highway section with an infrastructure system according to one embodiment of the present invention.
Fig. 2 shows a decision tree according to an exemplary embodiment of the method according to the invention for driving assistance in an infrastructure for an at least partially automatically guided vehicle.
Fig. 3 schematically illustrates a possible signal flow of an embodiment of the method according to the invention for driving assistance in an infrastructure for an at least partially automatically guided vehicle.
Fig. 4 shows a device for driving assistance in an infrastructure for an at least partially automatically guided vehicle according to an embodiment of the invention.
Fig. 5 shows an embodiment of an infrastructure system according to the invention, designed for driving assistance for an at least partially automatically guided vehicle.
Detailed Description
In the following description of the embodiments of the present invention, the same elements are denoted by the same reference numerals, and repeated description of these elements is omitted as necessary. The figures depict the subject matter of the invention only schematically.
Fig. 1 shows a top view of a highway section 17 with an infrastructure system 10. The infrastructure system 10 includes four environmental sensors 12a, 12b, 13a and 13b arranged on a gantry sign 18 above the roadway. In this example, the environmental sensors 12a and 12b are camera sensors. In this example, the environmental sensors 13a and 13b are radar sensors. More environmental sensors and/or other or additional sensor types, such as lidar sensors, may additionally or alternatively be used. The environmental sensors 12a, 12b, 13a, 13b together form an environmental sensor device 15 of the infrastructure system 10.
Within the highway section 17, a plurality of vehicles 20, 40, 50 are present in the illustrated situation, wherein the vehicle 20 is configured in this example as a highly automated vehicle whose automated driving functions are to be assisted by the infrastructure system 10. The infrastructure system 10 is designed to detect, by means of the environment sensor device 15, first environmental information about the area 30 to be monitored by the infrastructure. This environmental information can be transmitted wirelessly to the vehicle 20 as a V2X message by means of the transmitting unit 60 of the infrastructure system 10.
The environment sensor device 15 covers the field of view 30 of the highway section 17. Within the field of view 30, it detects the vehicles 20, 40 and 50 as objects and can determine object characteristics such as position, speed and direction of motion. Furthermore, the environment sensor device 15 can detect information about open spaces 31 within its field of view 30, i.e. ground areas on which no objects are present and which are visible to the environment sensor device 15.
Occlusion information, for example an occlusion region 32 that currently cannot be detected by the environment sensor device 15 of the infrastructure due to occlusion, can now be derived from the first environmental information, i.e. from the field of view 30, the detected objects 20, 40 and 50, and the information about the detected open spaces 31. The occlusion information can likewise be transmitted, explicitly or implicitly, by the transmitting unit 60 to the vehicles in the field of view 30 of the highway section 17.
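The derivation of the occlusion region from the field of view, the detected objects and the detected open spaces amounts to a set difference, which can be sketched on a simple cell grid (an illustrative sketch, not the patent's implementation):

```python
def derive_occlusion(fov_cells, object_cells, free_cells):
    """Cells of the static field of view that are neither occupied by a
    detected object nor confirmed as open space are currently occluded."""
    return set(fov_cells) - set(object_cells) - set(free_cells)

# 4x2-cell field of view; one cell occupied by a vehicle, five confirmed free
fov = {(x, y) for x in range(4) for y in range(2)}
occupied = {(1, 0)}
free = {(0, 0), (0, 1), (1, 1), (3, 0), (3, 1)}
occluded = derive_occlusion(fov, occupied, free)  # the remaining cells
```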
The vehicle 20 has its own environment sensor system 22, with which the direct surroundings of the vehicle 20 can be detected and second environmental information about the current environment of the vehicle 20 can be obtained. The environment sensor system 22 of the vehicle comprises, for example, one or more cameras, lidar sensors and/or radar sensors. As stated above, this environmental information alone is not sufficient to solve the driving task (e.g. driving at higher speeds) within the current operational design domain (ODD).
To solve this driving task, first environmental information is now transmitted to the vehicle 20 by the transmitting unit 60 of the infrastructure system 10, which first environmental information comprises information about the field of view 30, i.e. the open space 31 and the objects 20, 40, 50. Additional occlusion information can be calculated from this, and the driving strategy can be adapted, for example, as a function of the position and speed of the vehicle 20 relative to the occlusion region 32 thus determined. For this purpose, the vehicle has, for example, a computing unit for situation analysis.
Fig. 2 shows a decision tree 100 according to an exemplary embodiment of the method according to the invention for driving assistance in an infrastructure for an at least partially automatically guided vehicle, as it may be executed in the vehicle 20 by a corresponding computing unit. First, it is checked in step 110 whether an occlusion region 32 is detected in the safety-relevant environment on the lane traveled by the vehicle 20. A situation decision 120 is then made based on whether the occlusion region 32 lies within a defined near zone of the vehicle (e.g. within 35 meters) in which the vehicle's own sensor system is available. If this is the case, so-called qualified environmental information detected by the environment sensor system 22 of the vehicle 20 is used to calculate the driving strategy of the vehicle 20, according to step 130. In this context, "qualified" means that the detected environmental information is sufficiently available and reliable to base driving behavior on it.
The occlusion is then considered to be controllable according to the result 190.
If no occlusion region 32 is detected within the defined near zone of the vehicle 20, a further situation decision 140 follows, in which it is checked whether the vehicle's environment sensor system can also detect environmental information at a greater distance from the vehicle (so-called far-range data); for distance reasons, such data is generally not considered "qualified", and its availability and reliability must be checked during the journey. If this information is available, this environmental information detected by the vehicle's environment sensor system (so-called unqualified environmental information about the area at a greater distance from the vehicle 20) can be used, according to step 150, to compensate for the occlusion region of the infrastructure's environment sensor system and thus to calculate the driving strategy of the vehicle 20. The occlusion is then considered to be controllable according to the result 190.
If the vehicle's sensor system cannot detect any environmental information at a greater distance from the vehicle, it is determined in step 160 that the vehicle 20 initially maintains its speed and does not accelerate. A situation decision 170 is then made, in which it is checked whether the occlusion lasts shorter than a certain tolerance time, e.g. 0.5 seconds. If this is the case, the occlusion is considered to be controllable according to the result 190. If the tolerance time is exceeded, or if the occlusion is considered too critical by the vehicle 20 according to a criticality assessment, the distance of the vehicle 20 to the occlusion region is increased in step 180, for example by braking or a trajectory change. This increase in distance can be made in particular on the basis of a criticality assessment of the occlusion region 32. When a larger distance is reached, the occlusion is considered controllable according to the result 190. The criticality assessment of the occlusion region 32 can, for example, take into account the dynamics of the occlusion region 32 and/or its size and/or its position relative to the vehicle 20, and may already have been performed in advance by the vehicle 20 or by the infrastructure system 10.
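The decision tree of Fig. 2 can be summarized in code roughly as follows (a sketch assuming the example values of 35 meters for the near zone and 0.5 seconds for the tolerance time; the function and parameter names are hypothetical):

```python
NEAR_ZONE_M = 35.0   # assumed near-zone radius from the example
TOLERANCE_S = 0.5    # assumed tolerance time from the example

def handle_occlusion(distance_to_occlusion_m, far_data_available,
                     occlusion_duration_s, too_critical):
    """Sketch of decision tree 100: returns the chosen action."""
    if distance_to_occlusion_m <= NEAR_ZONE_M:
        # step 130: qualified data from the vehicle's own sensors
        return "use_qualified_vehicle_data"
    if far_data_available:
        # step 150: unqualified far-range data compensates the occlusion
        return "use_unqualified_far_data"
    # step 160: hold speed, then situation decision 170
    if occlusion_duration_s < TOLERANCE_S and not too_critical:
        return "hold_speed_tolerate"
    # step 180: brake or change trajectory to increase distance
    return "increase_distance"
```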
Thus, the vehicle 20 uses its own, sufficiently qualified environmental information to compensate for occlusions in the near zone. For areas beyond the near zone, the vehicle 20 should use its own data, if present, to compensate for infrastructure occlusions, even though that data is effectively "unqualified" for that zone. In terms of safety, the use of unqualified or only partially qualified data can be offset by the low probability of an occlusion.
If no vehicle data is available for the occlusion region (the occlusion lies outside a large area around the vehicle, e.g. 35 meters), the vehicle can tolerate the occlusion while holding its speed constant. The tolerance time in this example is assumed to be 0.5 seconds and can be adapted depending on the situation or on the speed of the vehicle relative to the occlusion region. After the tolerance time has elapsed, the vehicle must react to the occlusion in accordance with its criticality and establish a greater distance to the occlusion region with comfortable deceleration. For this purpose, for example, occlusion information for the relevant region of interest in the direction of travel can be forwarded from the sensing part, i.e. the environment sensor system of the vehicle 20, to the situation analysis unit of the vehicle 20.
In this way, the occlusion is in any case rendered controllable, so that safe, at least partially automated operation of the vehicle 20 remains possible.
Fig. 3 schematically shows the signal flow of a method according to an embodiment of the invention. A V2X message 212 implemented as a CPM message is transmitted from the infrastructure system 210 by means of a communication unit (roadside unit, "RSU"). The V2X message 212 includes a first list with environmental information about objects and their characteristics (e.g. position, speed, direction of motion, size, ...) together with corresponding confidence values that describe the reliability of the respective object characteristics. In addition, the V2X message 212 includes a second list with environmental information about open spaces. The open spaces in the second list are likewise each assigned a confidence value. The first list and the second list are generated by means of measurements of the environment sensor device (not shown) of the infrastructure system 210. The V2X message 212 may also include additional information about the infrastructure system 210, describing, for example, the static field of view of the environment sensor device of the infrastructure system 210.
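The content of such a CPM-style V2X message can be modeled as plain data roughly as follows (the field names are illustrative assumptions and do not reproduce the actual CPM encoding):

```python
# Sketch of the payload of V2X message 212: objects, open spaces and the
# static field of view, each with confidence values where applicable.
cpm_message = {
    "station_id": 210,  # identifier of the sending roadside unit (assumed)
    "objects": [        # first list: detected objects and their characteristics
        {"id": 40, "position": (80.0, 0.5), "speed": 30.0,
         "heading_deg": 90.0, "size_m": (4.5, 1.8), "confidence": 0.9},
    ],
    "free_spaces": [    # second list: confirmed open areas
        {"polygon": [(0, 0), (50, 0), (50, 10), (0, 10)], "confidence": 0.95},
    ],
    # additional information: static field of view of the sensor device
    "static_field_of_view": [(0, 0), (200, 0), (200, 10), (0, 10)],
}
```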
The vehicle 220 receives the information 212 by means of a communication control unit ("CCU"). The vehicle 220 also detects environmental information by means of its own environment sensor system and thus generates a third list 222, which includes the objects in the environment of the vehicle 220 and their characteristics, together with the associated confidence values.
A combined data set 240 is generated by data combination 230 from the information 212, with its first and second lists, and from the third list 222. The combination is made, for example, as a function of the distance of the infrastructure sensor device from the vehicle, using known sensor-data-fusion methods or by means of rules. Information received or known via the V2X message 212, such as information about the static field of view of the environment sensor device of the infrastructure system 210, is also taken into account here. As a result, the combined data set 240 comprises a list of objects, the characteristics associated with these objects and the associated confidence values. The combined data set 240 further includes a list of ground areas for which no information is available (areas of unknown condition, or "no-information areas").
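A simple rule-based variant of the data combination 230 — vehicle detections win inside the near zone, infrastructure detections are used beyond it — could look like this (an illustrative sketch with an assumed 35-meter near zone, not the patent's fusion method):

```python
def combine(infra_objects, vehicle_objects, vehicle_pos, near_zone_m=35.0):
    """Rule-based combination 230: inside the near zone the vehicle's own
    (qualified) detections are used; beyond it, the infrastructure data."""
    def dist(obj):
        dx = obj["position"][0] - vehicle_pos[0]
        dy = obj["position"][1] - vehicle_pos[1]
        return (dx * dx + dy * dy) ** 0.5
    near = [o for o in vehicle_objects if dist(o) <= near_zone_m]
    far = [o for o in infra_objects if dist(o) > near_zone_m]
    return near + far

infra = [{"id": 50, "position": (150.0, 0.0)},    # 50 m ahead: kept
         {"id": 60, "position": (120.0, 0.0)}]    # 20 m ahead: vehicle zone, dropped
vehicle = [{"id": 40, "position": (110.0, 0.0)}]  # 10 m ahead: kept
fused = combine(infra, vehicle, vehicle_pos=(100.0, 0.0))
```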
The combined data set 240 is now, on the one hand, subjected to a situation analysis 242, in which the situation is evaluated on the basis of the available information and the criticality and feasibility of possible driving maneuvers are checked. On the other hand, an analysis 244 is carried out on the areas for which no information is available, in order to calculate occlusion regions and, in particular, to perform a criticality assessment of this occlusion information with regard to the driving safety of the vehicle 220 (see Fig. 2). The results of both analyses 242 and 244 are incorporated into a combined decision 250 regarding the further behavior of the vehicle 220, which ensures the greatest possible safety. This decision 250 and the driving parameters derived from it (e.g. speed, acceleration, trajectory) are transmitted to the control unit 260 of the vehicle 220.
In this example, information about the detected objects and the open spaces is transmitted by the infrastructure system 210 or its environment sensor device. The occlusion information is transmitted only implicitly: regions of the known field of view of the infrastructure system 210 that are associated neither with an object nor with an open space are identified as occluded (i.e. as regions of unknown condition).
Alternatively to this, it is possible to:
- additionally (i.e. explicitly) transmit the occluded regions;
- transmit the occluded regions and, in return, omit either the objects or the open spaces;
- remove the occluded regions from the infrastructure's view already within the infrastructure and transmit a "dynamic safe area" instead.
The elements described so far can be transmitted either as lists or in the form of a grid whose cells are assigned characteristics.
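The grid variant, in which each cell is assigned a characteristic, can be sketched as follows (illustrative; the cell labels and the derivation rule are assumptions):

```python
OBJECT, FREE, OCCLUDED = "object", "free", "occluded"

def make_grid(width, height, objects, free):
    """Assign every cell of the field of view a characteristic; cells that
    are neither occupied nor confirmed free count as occluded."""
    grid = {}
    for x in range(width):
        for y in range(height):
            if (x, y) in objects:
                grid[(x, y)] = OBJECT
            elif (x, y) in free:
                grid[(x, y)] = FREE
            else:
                grid[(x, y)] = OCCLUDED
    return grid

# 2x2 field of view: one occupied cell, one free cell, two occluded cells
grid = make_grid(2, 2, objects={(0, 0)}, free={(1, 1)})
```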
Fig. 4 shows a device 400 for driving assistance in an infrastructure for an at least partially automated guided vehicle according to an embodiment of the invention, which device is designed to carry out a method according to the first aspect of the invention. To this end, the apparatus 400 comprises: a receiving unit 430 for receiving first environmental information about the current environment of the vehicle, which is detected by means of an environment sensing device of the infrastructure; a calculation unit 440 configured to derive occlusion information from the first environment information, wherein the occlusion information represents an area of the vehicle environment that is currently undetectable by the environment sensing device of the infrastructure due to occlusion; and a control unit 460 configured to at least partly automatically guide the vehicle in dependence of the occlusion information.
Fig. 5 shows an infrastructure system 500 for an infrastructure, which is designed for driving assistance for an at least partially automated guided vehicle, comprising a transmitting unit 560 and an environment sensor arrangement 515 with at least one environment sensor 512, which is designed for detecting first environmental information of an area to be monitored by the infrastructure. Occlusion information can be derived from the first environment information, wherein the occlusion information represents a region of the region to be monitored by the infrastructure that is currently not detectable by the environment sensing device 515 of the infrastructure due to occlusion, and wherein the occlusion information and/or the first environment information is transmitted to the vehicle by the transmitting unit 560.

Claims (13)

1. Method for driving assistance via an infrastructure for a vehicle (20, 220) which is at least partially automatically guided, wherein first environmental information about an area to be monitored is detected by means of an environmental sensor device (15) of the infrastructure, characterized in that occlusion information is derived from the first environmental information, wherein the occlusion information relates to an area (32) in the environment of the vehicle (20, 220) which is currently not detectable by the environmental sensor device (15) of the infrastructure due to occlusion, and wherein the occlusion information and/or the first environmental information is provided for use by the vehicle (20, 220).
2. The method according to claim 1, characterized in that the environment information comprises at least first information about a static field of view (30) of an environment sensing device (15) of the infrastructure and second information about objects (20, 40, 50) within the field of view (30) of the environment sensing device and third information about an open space (31) within the field of view (30) of the environment sensing device (15), wherein the occlusion information is generated from the first information, the second information and the third information.
3. Method according to claim 1 or 2, characterized in that the occlusion region (32) is directly determined by the infrastructure environment sensor device (15) as occlusion information and made available to the vehicle (20, 220).
4. The method according to claim 2 or 3, characterized in that, by means of a computing unit (60) of the infrastructure, a reliably detectable area is determined, in particular dynamically, by means of the first information about a static field of view (30) of an environment sensor device (15) of the infrastructure and/or the second information about an object (20, 40, 50) within the field of view (30) of the environment sensor device and/or the third information about an open space (31), and occlusion information is determined therefrom and provided for use by the vehicle (20, 220).
5. Method according to one of claims 1 to 4, characterized in that the occlusion information and/or the environment information is made available to the vehicle (20, 220) as a list and/or in the form of a grid, the cells of which are assigned characteristics representing the environment information and/or the occlusion information.
6. Method according to one of the preceding claims, characterized in that second environmental information about the current environment of the vehicle is detected by means of an environmental sensor device (22) of the vehicle, wherein the occlusion information (32) is additionally determined from the second environmental information.
7. Method according to one of the preceding claims, characterized in that the first environmental information and/or the occlusion information is transmitted from the infrastructure system (110, 210) to the vehicle by means of a V2X message (212), wherein the V2X message (212) comprises an object list and information about the open space and a respectively associated confidence range.
8. Method according to one of the preceding claims, characterized in that the first environment information and/or occlusion information is combined with second environment information generated by an environment sensor device (22) of the vehicle (20, 220) into a data set (240), wherein occlusion regions (32), in particular dynamic occlusion regions (32), of the infrastructure in the environment of the vehicle (20, 220) are compensated.
9. Method according to any of the preceding claims, characterized in that the current speed and/or acceleration of the vehicle (20, 220) is adapted depending on the occlusion information, wherein the adaptation is made in particular depending on the distance of the vehicle (20, 220) from the occlusion region (32) represented by the occlusion information.
10. The method according to claim 9, characterized in that the behavior of the vehicle (20, 220) is controlled in dependence of occlusion areas (32) and objects (40, 50) and open spaces (31) in the vehicle environment.
11. Method according to one of claims 1 to 10, characterized in that a criticality assessment is carried out for an occlusion region (32), wherein in particular the size and/or the position of the occlusion region (32) relative to the vehicle (20, 220) and/or the dynamics of the occlusion region (32) are taken into account, wherein the vehicle (20, 220) is assisted in driving by the infrastructure in accordance with the criticality assessment.
12. An arrangement for driving assistance in an infrastructure for an at least partially automated guided vehicle (20, 220), the arrangement being configured for implementing the method according to any one of claims 1 to 11, the arrangement comprising:
-a receiving unit for receiving first environmental information about an area to be monitored, the first environmental information being detected by means of an environment sensing device of the infrastructure;
-a computing unit configured for deriving occlusion information from the first environment information, wherein the occlusion information relates to a region that currently cannot be detected by an environment sensing device of the infrastructure due to occlusion;
-a control unit configured for guiding the vehicle at least partially automatically in dependence of the occlusion information.
13. An infrastructure system for an infrastructure, the infrastructure system being configured for driving assistance for an at least partially automated guided vehicle, the infrastructure system comprising a transmitting unit and an environment sensing device with at least one environment sensor, the environment sensing device being configured for detecting first environment information about an area to be monitored by the infrastructure in which the at least partially automated guided vehicle is moving, characterized in that occlusion information is derivable from the first environment information, wherein the occlusion information relates to an area of the area to be monitored by the infrastructure that is currently not detectable by the environment sensing device of the infrastructure due to occlusion, and wherein the occlusion information and/or the first environment information is transmitted to the vehicle by the transmitting unit.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102021209434.1A DE102021209434A1 (en) 2021-08-27 2021-08-27 Method and device for driving support for an at least partially automated vehicle in an infrastructure
DE102021209434.1 2021-08-27

Publications (1)

Publication Number Publication Date
CN115938142A true CN115938142A (en) 2023-04-07

Family

ID=85175533


Country Status (3)

Country Link
CN (1) CN115938142A (en)
AT (1) AT525388B1 (en)
DE (1) DE102021209434A1 (en)


Also Published As

Publication number Publication date
AT525388B1 (en) 2024-02-15
AT525388A2 (en) 2023-03-15
AT525388A3 (en) 2023-12-15
DE102021209434A1 (en) 2023-03-02


Legal Events

Date Code Title Description
PB01 Publication