CN114074665A - Method for operating an assistance system of a vehicle and assistance system of a vehicle - Google Patents

Method for operating an assistance system of a vehicle and assistance system of a vehicle

Info

Publication number
CN114074665A
Authority
CN
China
Prior art keywords
surroundings
vehicle
objects
maps
map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110918739.0A
Other languages
Chinese (zh)
Inventor
M·扎尔费尔德
N·吉尔巴赫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of CN114074665A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865 Combination of radar systems with lidar systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 Combination of radar systems with cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/89 Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to a method for operating an assistance system of a vehicle, comprising the following steps: sensing the surroundings of the vehicle; generating at least two surroundings maps (2a, 2b) at different points in time, wherein each surroundings map (2a, 2b) is divided into a plurality of cells (3) and each cell (3) is assigned a predefined value based on the presence of an object (4) at the location of the surroundings (U) corresponding to that cell (3); classifying objects (4) sensed in the surroundings (U) of the vehicle (10); identifying static objects (4a) by means of the classification; and comparing the at least two surroundings maps (2a, 2b) with one another, wherein the comparison is carried out using those cells (3) of the surroundings maps (2a, 2b) that represent static objects (4a). The invention also relates to an assistance system for a vehicle.

Description

Method for operating an assistance system of a vehicle and assistance system of a vehicle
Technical Field
The invention relates to a method for operating an assistance system of a vehicle, and to an assistance system of a vehicle.
Background
Assistance systems for vehicles are known which generate a map of the surroundings, a so-called "occupancy grid map". Such a surroundings map can represent objects in the surroundings of the vehicle. Typically, the surroundings map is generated from environment data sensed by an environment sensing device of the vehicle. In particular, erroneous environment data and motion information can cause deviations between the representation in the surroundings map and reality.
Disclosure of Invention
In contrast, the object of the method according to the invention is to make it possible to provide a surroundings map with particularly precise and reliable information. This is achieved by a method for operating an assistance system of a vehicle, comprising the following steps:
-sensing the surroundings of the vehicle, in particular by means of an environment sensing device of the vehicle,
-generating at least two surroundings maps at different points in time, each in particular containing an image of the surroundings, wherein each surroundings map is divided into a plurality of cells and each cell is assigned a predefined value based on the presence of an object at the location of the surroundings corresponding to that cell,
-classifying objects sensed in the surroundings of the vehicle,
-identifying static objects by means of the classification, and
-comparing the at least two surroundings maps with one another.
The comparison is carried out using those cells of the surroundings maps that represent static objects.
In other words, in this method two surroundings maps, which may also be referred to as "occupancy grid maps", are generated from the sensed surroundings. Each surroundings map is divided into a plurality of cells, each of which is assigned a predefined value. A value is assigned to a cell depending on whether an object is present at the location in the vehicle's surroundings corresponding to that cell. This in particular enables a localization of the vehicle, preferably relative to the objects. For example, each cell may be assigned a value that marks whether or not an object is present at its location.
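By way of illustration only, the following minimal sketch shows one way such a surroundings map could be represented as an occupancy grid; the grid size, the value convention (0 for "no object", 1 for "occupied") and all names are assumptions made for this sketch, not details prescribed by the invention.

```python
# Hedged sketch of an occupancy grid; sizes and value conventions are assumptions.
import numpy as np

CELL_SIZE_M = 1.0   # assumed side length of one square cell
GRID_CELLS = 10     # assumed 10 x 10 cells (matching the embodiment described below)

def make_grid() -> np.ndarray:
    """Create an empty surroundings map; every cell starts as 'no object' (0)."""
    return np.zeros((GRID_CELLS, GRID_CELLS), dtype=np.int8)

def mark_object(grid: np.ndarray, x_m: float, y_m: float) -> None:
    """Assign the 'occupied' value (1) to the cell covering a sensed object location."""
    col, row = int(x_m // CELL_SIZE_M), int(y_m // CELL_SIZE_M)
    if 0 <= row < grid.shape[0] and 0 <= col < grid.shape[1]:
        grid[row, col] = 1
```

A cell value of 1 then represents the state "occupied" and 0 the state "no object".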
Objects sensed in the surroundings of the vehicle are classified. Classification is understood here to mean the recognition of objects and their division into different categories; in particular, the objects are divided into static and movable (i.e. non-static) objects. The objects classified as static are then identified among the classified objects. For the comparison of the two surroundings maps, only those regions are used whose corresponding locations in the surroundings contain an identified static object; preferably, all other regions of the surroundings maps are ignored. Static objects are understood here to be objects whose location in the surroundings does not change over time, such as buildings, walls, trees or other structural objects. This means that the comparison of the surroundings maps is carried out on the basis of immovable objects that always have the same location in the surroundings, so that, for example, the position relative to these objects can be determined particularly reliably.
The method therefore offers the advantage that the immovable regions of the surroundings are used for the comparison of the surroundings maps. For example, a previously generated first surroundings map of the same region can be matched and confirmed in a reliable and simple manner even when a long time has passed between drives through that region. This is possible particularly simply and reliably because static objects do not change their position in the surroundings. A surroundings map with particularly high accuracy can thus be provided, for example in order to allow the highest possible accuracy when localizing the vehicle within the surroundings map.
Preferred developments of the invention are the subject matter of the dependent claims.
Preferably, the method further comprises the step of localizing the vehicle in at least one of the surroundings maps based on the comparison of the at least two surroundings maps with one another. Localization is understood here to mean determining the position and orientation of the vehicle within the surroundings map. Preferably, the first sensing for the first surroundings map is carried out as a so-called training run, for example during a drive manually controlled by the driver. In this run, objects are mapped into the first surroundings map, for example by means of a distance measurement method and an environment sensor of the vehicle. Later, during a second drive through the same region of the surroundings, a second surroundings map can be generated by a second sensing and compared with the first surroundings map. Based on this comparison, the position of the vehicle can then be estimated, in particular relative to objects depicted in the surroundings map. Since the two surroundings maps are compared with one another on the basis of static objects, a particularly precise and reliable localization of the vehicle can be achieved.
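A hedged sketch of one conceivable cell-level matching scheme is given below: the second map is shifted over the first, and agreement is scored only on cells marked as static. The brute-force shift search, the wrap-around behaviour of np.roll and all names are illustrative assumptions, not the patent's method.

```python
# Hedged localization sketch: find the cell offset that best aligns two maps,
# scoring agreement only where the static mask is set. np.roll wraps around at
# the map border, a simplification acceptable for small shifts.
import numpy as np

def locate(map_a: np.ndarray, map_b: np.ndarray,
           static_mask: np.ndarray, max_shift: int = 3) -> tuple[int, int]:
    """Return the (row, col) cell offset that best aligns map_b with map_a."""
    best_score, best_shift = -1, (0, 0)
    for dr in range(-max_shift, max_shift + 1):
        for dc in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(map_b, dr, axis=0), dc, axis=1)
            score = int(np.sum((shifted == map_a) & static_mask))
            if score > best_score:
                best_score, best_shift = score, (dr, dc)
    return best_shift
```

The returned offset would correspond to the vehicle's displacement, in whole cells, between the two drives.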
Preferably, the method further comprises the step of prioritizing those cells that represent static objects in the surroundings, the at least two surroundings maps then being compared with one another by means of the prioritized cells. Prioritization is understood here to mean the preferential treatment of the prioritized cells, for example when evaluating the surroundings maps. Particularly preferably, every cell of the surroundings map can be assigned a weighting factor that is, for example, multiplied by the value of the cell. Prioritization can then consist of increasing this weighting factor, so that the values of the cells representing static objects carry more weight, for example during localization of the vehicle, whereby a higher accuracy of the surroundings map can be achieved.
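The weighting idea can be sketched as follows; the concrete factor value and the multiplicative combination are assumptions chosen for illustration.

```python
# Hedged sketch of prioritization via weighting factors: static-object cells
# receive a larger factor, so they dominate the comparison score.
import numpy as np

def weighted_agreement(map_a: np.ndarray, map_b: np.ndarray,
                       static_mask: np.ndarray, static_weight: float = 5.0) -> float:
    """Agreement score in which cells representing static objects count more."""
    weights = np.ones(map_a.shape, dtype=float)
    weights[static_mask] *= static_weight   # prioritize static-object cells
    return float(np.sum((map_a == map_b) * weights))
```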
Particularly preferably, the method further comprises the step of identifying movable objects in the surroundings of the vehicle by means of the classification. Those cells of the surroundings maps that represent a movable object are then excluded from the comparison of the at least two surroundings maps. Preferably, these cells are additionally excluded from the localization of the vehicle. This means that cells whose corresponding locations in the surroundings contain an identified movable object are not used for the comparison of the two surroundings maps; for example, these cells can simply be ignored in the comparison, or they can be assigned a predefined, in particular additional, value specifying that the respective cell is not taken into account. In this way, a particularly high accuracy can be achieved when comparing the surroundings maps and when localizing the vehicle in at least one of them, because information that may be erroneous due to a movable object occupying different positions at the different recording times of the surroundings maps is not taken into account. Furthermore, the method can be carried out particularly simply and resource-efficiently, since fewer objects in the surroundings have to be sensed and analyzed precisely, for example for the comparison step and in particular also for the localization step.
Preferably, the method further comprises the step of neutralizing (neutralisieren) those cells of the surroundings map that represent movable objects. Neutralizing is understood here to mean resetting the cell value to a standard value, in particular the value present before assignment. In particular, a cell is neutralized by resetting it so that it represents the state "no object". Alternatively, neutralization can set the cell to the state "occupied", i.e. mark it as containing an object. By neutralizing the cells of recognized movable objects, the regions of the surroundings map representing static objects can be highlighted in a particularly simple manner, so that an accurate comparison and an accurate localization can be achieved with simple and cost-effective means.
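A sketch of the neutralization step, under the assumption that 0 encodes "no object" and 1 encodes "occupied" as in the sketches above:

```python
# Hedged neutralization sketch: cells representing recognized movable objects
# are reset to the standard 'no object' value (or, per the alternative in the
# text, set to 'occupied'), so they drop out of comparison and localization.
import numpy as np

FREE, OCCUPIED = 0, 1   # assumed value convention

def neutralize(grid: np.ndarray, movable_mask: np.ndarray,
               set_occupied: bool = False) -> np.ndarray:
    """Return a copy of the map with movable-object cells neutralized."""
    out = grid.copy()
    out[movable_mask] = OCCUPIED if set_occupied else FREE
    return out
```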
Preferably, the method further comprises the step of identifying traffic participants in the surroundings of the vehicle by means of the classification, the identified traffic participants being recognized as movable objects. Traffic participants are understood here to mean humans, animals and vehicles of any type, in particular those located in regions of the surroundings that can be traveled by the vehicle. Each such object is considered a traffic participant regardless of whether it is moving or stationary at the time of sensing. This means that, for example, a parked motor vehicle is recognized as a traffic participant, and correspondingly as a movable object, whether or not occupants are present.
It is further preferred that the sensing of the surroundings of the vehicle comprises generating camera images by means of a camera, the classification of the objects sensed in the vehicle's surroundings then being carried out by evaluating these camera images. The camera is preferably part of an environment sensing device of the vehicle. Evaluating the camera images generated by the camera allows the classification to be carried out in a simple and reliable manner.
Particularly preferably, the classification is carried out automatically by means of an evaluation unit. The evaluation unit may, for example, be part of a control device of the vehicle or be provided separately from the control device. In particular, the evaluation unit can comprise an algorithm that analyzes the generated camera images in order to classify the objects and to distinguish static from movable objects in the images. For example, the evaluation unit can use artificial intelligence to carry out the classification.
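How such an evaluation unit might map detector output onto the static/movable split can be sketched as follows; the class lists are illustrative assumptions (noting that, per the description above, vehicles count as movable even when parked), and the detector itself is assumed to be any off-the-shelf pretrained model.

```python
# Hedged classification sketch: map class labels from an assumed pretrained
# object detector onto the static/movable categories used by the method.
MOVABLE_CLASSES = {"person", "animal", "car", "truck", "bicycle"}  # traffic participants
STATIC_CLASSES = {"wall", "building", "tree", "pillar"}

def classify_label(label: str) -> str:
    """Return 'movable', 'static' or 'unknown' for a detector class label."""
    if label in MOVABLE_CLASSES:
        return "movable"        # even a parked car is treated as movable
    if label in STATIC_CLASSES:
        return "static"
    return "unknown"
```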
Preferably, the classification can also be carried out and/or confirmed manually via a user input by the vehicle user, in particular by means of a human-machine interface such as a display and/or input device. For example, the generated camera images of the surroundings can be shown to the user on the display and/or input device, and the user can select regions of the image and mark them as static or movable. Alternatively or additionally, the display and/or input device can show which regions have been classified as static or movable by the evaluation unit; these classifications can then be confirmed or adjusted by user input. This makes the method verifiable and particularly reliable for the driver, since correct operation of the classification can be checked and, if necessary, corrected through interaction with the driver.
Preferably, the method further comprises the step of identifying L-shaped objects in the vehicle's surroundings, the identified L-shaped objects being recognized as static objects. An L-shaped object is understood here to be an object having two at least substantially straight faces which intersect at a right angle or preferably at an angle between 20° and 160°, the individual faces in particular being oriented vertically. Such L-shaped objects are typically structural features such as walls or houses. Identifying such L-shaped structures therefore makes it possible to determine in a particularly simple manner in which regions of the surroundings static, i.e. immovable, objects are present, without requiring, for example, a complex analysis of images of these objects.
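The angle test from this definition can be sketched directly; the face direction vectors are assumed to come from line fits to sensor points, which are not shown here.

```python
# Hedged L-shape test: two roughly straight faces whose ground-plane directions
# meet at an angle between 20 and 160 degrees mark the object as static.
import math

def is_l_shaped(dir_a: tuple[float, float], dir_b: tuple[float, float]) -> bool:
    """True if the two face directions intersect at an angle in [20, 160] degrees."""
    ax, ay = dir_a
    bx, by = dir_b
    cos_angle = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return 20.0 <= angle <= 160.0

# Example: a wall corner meeting at a right angle qualifies as L-shaped.
assert is_l_shaped((1.0, 0.0), (0.0, 1.0))
```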
In addition, the invention provides an assistance system of a vehicle. The assistance system comprises an environment sensing device arranged for sensing the surroundings of the vehicle and a control device arranged for carrying out the described method. In particular, the environment sensing device is also arranged for sensing odometry data of the vehicle. The assistance system can thus be provided with particularly low hardware complexity and cost, while allowing the vehicle to be operated with high accuracy and high comfort for the driver.
Preferably, the environment sensing device of the assistance system comprises radar sensors and/or lidar sensors for sensing the surroundings. It is particularly advantageous to use radar sensors with a medium effective range, for example up to a maximum of 50 m, preferably up to a maximum of 25 m, and in particular at least 5 m. The surroundings maps are then generated based on the environment data sensed by the environment sensing device, and preferably the vehicle is also localized within the surroundings maps.
Particularly preferably, the environment sensing device of the assistance system also has a camera for sensing the surroundings and for generating camera images of the surroundings. The camera is preferably a near-range camera, in particular one arranged for sensing the surroundings in a region between 1 m and 20 m from the vehicle.
Preferably, the assistance system further comprises an evaluation unit and/or an input device. The evaluation unit is arranged for automatically classifying objects sensed in the surroundings of the vehicle; it may, for example, be part of a control device of the vehicle or be provided separately from the control device. Alternatively or additionally, the classification can be carried out and/or confirmed by means of the input device, in particular via a manual input by the vehicle user.
Drawings
The present invention will be described with reference to the following examples, which are provided in conjunction with the accompanying drawings. In the figures, functionally identical components are designated by the same reference numerals, respectively. Shown here are:
FIG. 1 is a schematic simplified diagram of the operation of a vehicle having an assistance system in accordance with a preferred embodiment of the present invention, and
Fig. 2 shows a further schematic representation of the operation of the vehicle of fig. 1.
Detailed Description
Fig. 1 and 2 show schematic, simplified diagrams of a vehicle 10 with an assistance system 50. They illustrate, in several simplified views, a method for operating the assistance system 50 of the vehicle 10.
The assistance system 50 comprises an environment sensing device 52 arranged for sensing the surroundings U of the vehicle 10. The environment sensing device 52 comprises a radar sensor, a lidar sensor and a camera. By means of the environment sensing device 52, objects 4 in the surroundings U of the vehicle 10, for example in the form of obstacles, can be sensed, as shown schematically in fig. 1(a).
In addition, the assistance system 50 comprises a control device 51 arranged for generating a surroundings map 2, which is an image of the surroundings U, based on the sensed surroundings U, as shown in fig. 1 (b).
The surroundings map 2 is divided into a plurality of cells 3. The surroundings map 2 is two-dimensional and lies in the plane of the roadway surface on which the vehicle 10 can travel in the surroundings U. The cells 3 are square and have a side length of 1 m. The entire surroundings map 2 is likewise square, with ten cells 3 along each of its length and width, i.e. the surroundings map 2 covers an area with a side length of 10 m.
Based on the sensed objects 4, a predefined value is assigned to each cell 3 of the surroundings map 2 whose position corresponds to a location of the surroundings U occupied by an object 4. For example, a value representing the state "occupied" is assigned to each cell 3a of the surroundings map 2 whose corresponding location in the surroundings U contains an object 4 (see fig. 1(b)).
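One common way such per-cell values could be filled in from a range measurement is sketched below: cells along the sensor ray are marked free and the cell at the detection is marked occupied. This ray-tracing update is a standard occupancy-grid technique assumed here for illustration; the embodiment does not prescribe a specific update rule.

```python
# Hedged sketch of a ray-based cell update for the 10 m x 10 m grid of the
# embodiment (1 m cells): traversed cells become free, the hit cell occupied.
import numpy as np

CELL = 1.0                                    # cell side length in metres
grid = np.full((10, 10), -1, dtype=np.int8)   # -1 unknown, 0 free, 1 occupied

def update_from_ray(g: np.ndarray, x0: float, y0: float,
                    x1: float, y1: float, steps: int = 100) -> None:
    """Trace from sensor position (x0, y0) to detection (x1, y1), in metres."""
    for t in np.linspace(0.0, 1.0, steps):
        col = int((x0 + t * (x1 - x0)) // CELL)
        row = int((y0 + t * (y1 - y0)) // CELL)
        if 0 <= row < g.shape[0] and 0 <= col < g.shape[1]:
            g[row, col] = 0                   # space the ray passed through is free
    col, row = int(x1 // CELL), int(y1 // CELL)
    if 0 <= row < g.shape[0] and 0 <= col < g.shape[1]:
        g[row, col] = 1                       # the detection itself is occupied

update_from_ray(grid, 5.0, 0.0, 5.0, 7.5)     # e.g. an object sensed 7.5 m ahead
```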
The vehicle 10 can be localized in the surroundings U by means of the surroundings map 2. For this purpose, a second surroundings map 2b is generated at a later point in time and compared with the previously generated first surroundings map 2a. This is shown in simplified form in fig. 2(c).
In this case, the vehicle 10 is first driven through the region of the surroundings U during a first training run, for example, under the manual control of the driver. During this training run, a first surroundings map 2a is generated. This is shown in fig. 2(a) and (b).
In a second step, a second surroundings map 2b can be generated, for example, when the vehicle 10 is again located in the same region of the surroundings U at a later point in time. In each of the two surroundings maps 2a, 2b, all cells 3 are assigned a corresponding value in order to represent an object 4 or a free area in the surroundings U.
Next, in a further step, the two surroundings maps 2a, 2b are compared with each other, as schematically illustrated in fig. 2(c). The two surroundings maps 2a, 2b can be superimposed on the basis of those cells 3 which represent static objects 4a, and the vehicle 10 can thus be localized in the surroundings U with particularly high accuracy during the second drive on the basis of the two surroundings maps 2a, 2b.
In the surroundings in which the vehicle 10 moves, the surroundings U can often change over time. Movable objects 4b may be present in the surroundings U, such as other traffic participants, for example pedestrians and moving or parked vehicles (see fig. 2(a)). Such movable objects 4b may be located at different positions at the points in time at which the surroundings maps 2a, 2b are created, or may no longer be within the same region of the surroundings U at all, for example during the second sensing for the second surroundings map 2b (fig. 2(b)). To enable a simple and reliable function of the assistance system 50 even under such changing conditions, the method for operating the assistance system 50 described below is carried out.
In this method, the objects 4 sensed in the surroundings U of the vehicle 10 by means of the environment sensing device 52 are classified. The classification is performed using camera images sensed by the camera of the environment sensing device 52; alternatively or additionally, the classification can also be carried out based on environment data recorded by means of the radar sensor and/or the lidar sensor. For this purpose, the camera images are analyzed by the evaluation unit 53 of the assistance system 50. The sensed objects 4 are divided into static objects 4a and movable objects 4b. For example, the pedestrian and the parked vehicle shown by way of example in fig. 2(a) are each identified and classified as movable objects 4b, while the walls bounding the parking space in fig. 2(a) are identified and correspondingly classified as static objects 4a.
The classification can be carried out in a particularly simple manner by searching for L-shaped objects in the surroundings U and recognizing the identified L-shaped objects as static objects 4a. For example, the wall in fig. 2(a) can easily be identified as a static object 4a based on the fact that its side faces 41 intersect in an L-shape, in this case at a right angle.
After the classification of the objects 4, the comparison of the two surroundings maps 2a, 2b is carried out in such a way that only those cells representing static objects 4a are considered. To simplify this, the regions 40b of the surroundings maps 2a, 2b representing movable objects 4b are neutralized (see fig. 2(b)). This means that those cells 3b representing movable objects 4b are assigned a value indicating the state "unoccupied" or "no object". The respective cells 3b are thereby excluded from the comparison of the surroundings maps 2a, 2b and therefore also from the subsequent localization of the vehicle 10.
Neutralizing the cells 3b representing movable objects 4b offers the advantage that errors in the comparison and/or localization, caused for example by objects 4 that have moved slightly between the two points in time at which the surroundings maps 2a, 2b were generated, can be avoided. In particular, only those regions of the surroundings maps 2a, 2b remain that are always located at the same point within the surroundings U. The surroundings maps 2a, 2b can therefore be provided in a simple manner and with particularly high accuracy, so that the highest possible accuracy of the position determination of the vehicle 10 can be achieved by means of the surroundings maps 2a, 2b.

Claims (14)

1. Method for operating an assistance system (50) of a vehicle (10), the method comprising the steps of:
-sensing a surrounding environment (U) of the vehicle (10),
-generating at least two surroundings maps (2a, 2b) at different points in time, wherein each surroundings map (2a, 2b) is divided into a plurality of cells (3), wherein each cell (3) is assigned a predefined value on the basis of the presence of an object (4) at a location of the surroundings (U) corresponding to the cell (3),
-classifying objects (4) sensed in a surrounding environment (U) of the vehicle (10),
-identifying a static object (4a) by means of said classification, and
-comparing the at least two surroundings maps (2a, 2b) with each other,
wherein the comparison is carried out on the basis of those cells (3) of the surroundings maps (2a, 2b) that represent static objects (4a).
2. The method of claim 1, further comprising the step of: locating the vehicle (10) in at least one of the surroundings maps (2a, 2b) on the basis of a mutual comparison of the at least two surroundings maps (2a, 2b).
3. The method according to any of the preceding claims, further comprising the step of: prioritizing cells (3) representing static objects (4a) in the surroundings (U), wherein the comparison of the at least two surroundings maps (2a, 2b) with one another takes place by means of the prioritized cells (3).
4. The method according to any of the preceding claims, further comprising the step of:
-identifying movable objects (4b) in the surroundings (U) of the vehicle (10) by means of the classification, wherein those cells (3) of the surroundings maps (2a, 2b) representing a movable object (4b) are excluded from the comparison of the at least two surroundings maps (2a, 2b).
5. The method of claim 4, further comprising the steps of:
-neutralizing those cells (3) of the surroundings maps (2a, 2b) representing a movable object (4b).
6. The method according to claim 4 or 5, further comprising the steps of:
-identifying traffic participants in the surroundings (U) of the vehicle (10) by means of the classification, wherein the traffic participants are recognized as movable objects (4b).
7. The method according to any one of the preceding claims, wherein the sensing of the surroundings (U) of the vehicle (10) comprises generating camera images by means of a camera, and wherein the classification is carried out by means of an evaluation of the camera images.
8. The method according to any one of the preceding claims, wherein the classification is carried out automatically by means of an evaluation unit (53).
9. The method according to any one of the preceding claims, wherein the classification can be carried out and/or confirmed manually by means of a user input of a user of the vehicle (10).
10. The method according to any of the preceding claims, further comprising the step of:
-identifying L-shaped objects in the surroundings (U) of the vehicle (10), wherein the identified L-shaped objects are recognized as static objects (4a).
11. An assistance system of a vehicle (10), the assistance system comprising:
-an environment sensing device (52) for sensing the surroundings (U) of the vehicle (10), and
-a control device (51) arranged for implementing the method according to any one of the preceding claims.
12. Assistance system according to claim 11, wherein the environment sensing device (52) has a radar sensor and/or a lidar sensor for sensing the surroundings (U).
13. Assistance system according to claim 11 or 12, wherein the environment sensing device (52) has a camera for sensing the surroundings (U) and for generating camera images.
14. The assistance system according to any one of claims 11 to 13, further comprising an evaluation unit (53) and/or an input device (54) for classifying objects (4) sensed in the surroundings (U) of the vehicle (10).
CN202110918739.0A 2020-08-11 2021-08-11 Method for operating an assistance system of a vehicle and assistance system of a vehicle Pending CN114074665A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102020210116.7A DE102020210116A1 (en) 2020-08-11 2020-08-11 Method for operating an assistance system of a vehicle
DE102020210116.7 2020-08-11

Publications (1)

Publication Number Publication Date
CN114074665A 2022-02-22

Family

ID=80000622

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110918739.0A Pending CN114074665A (en) 2020-08-11 2021-08-11 Method for operating an assistance system of a vehicle and assistance system of a vehicle

Country Status (2)

Country Link
CN (1) CN114074665A (en)
DE (1) DE102020210116A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014221888A1 (en) 2014-10-28 2016-04-28 Robert Bosch Gmbh Method and device for locating a vehicle in its environment
DE102014223363B4 (en) 2014-11-17 2021-04-29 Volkswagen Aktiengesellschaft Method and device for localizing a motor vehicle in a fixed reference map
DE102017201664A1 (en) 2017-02-02 2018-08-02 Robert Bosch Gmbh Method for locating a higher automated vehicle in a digital map
DE102017221691A1 (en) 2017-12-01 2019-06-06 Volkswagen Aktiengesellschaft Method and device for self-localization of a vehicle
DE102018133457B4 (en) 2018-12-21 2020-07-09 Volkswagen Aktiengesellschaft Method and system for providing environmental data

Also Published As

Publication number Publication date
DE102020210116A1 (en) 2022-02-17

Similar Documents

Publication Publication Date Title
US10528055B2 (en) Road sign recognition
US10949684B2 (en) Vehicle image verification
CN109522784B (en) Device and method for distinguishing between surmountable and non-surmountable objects
JP6800575B2 (en) Methods and systems to assist drivers in their own vehicles
US20200049511A1 (en) Sensor fusion
EP3032454B1 (en) Method and system for adaptive ray based scene analysis of semantic traffic spaces and vehicle equipped with such system
US11460851B2 (en) Eccentricity image fusion
CN103781685B (en) The autonomous drive-control system of vehicle
CN110857085A (en) Vehicle path planning
US20160139255A1 (en) Method and device for the localization of a vehicle from a fixed reference map
CN111382768A (en) Multi-sensor data fusion method and device
CN107406073B (en) Method and device for monitoring a target trajectory to be covered by a vehicle in terms of collision-free behavior
CN109814130B (en) System and method for free space inference to separate clustered objects in a vehicle awareness system
US20200020117A1 (en) Pose estimation
CN113343746B (en) Method and device for lane detection on a vehicle driving surface
CN109814125A (en) System and method for determining the speed of laser radar point
US11188085B2 (en) Vehicle capsule networks
US11829131B2 (en) Vehicle neural network enhancement
US20230237783A1 (en) Sensor fusion
Valldorf et al. Advanced Microsystems for Automotive Applications 2007
CN116703966A (en) Multi-object tracking
US20230394959A1 (en) Method for providing information about road users
US20220237889A1 (en) Analysis of dynamic spatial scenarios
CN113815627A (en) Method and system for determining a command of a vehicle occupant
US11087147B2 (en) Vehicle lane mapping

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination