US20220414151A1 - Method for Operating a Surroundings Sensing Device with Grid-Based Evaluation and with Fusion, and Surroundings Sensing Device - Google Patents


Info

Publication number
US20220414151A1
Authority
US
United States
Prior art keywords
surroundings
grid
sensing device
distance range
motor vehicle
Prior art date
Legal status
Pending
Application number
US17/780,272
Inventor
Sascha Steyer
Georg Tanzmeister
Current Assignee
Bayerische Motoren Werke AG
Original Assignee
Bayerische Motoren Werke AG
Priority date
Filing date
Publication date
Application filed by Bayerische Motoren Werke AG filed Critical Bayerische Motoren Werke AG
Assigned to BAYERISCHE MOTOREN WERKE AKTIENGESELLSCHAFT reassignment BAYERISCHE MOTOREN WERKE AKTIENGESELLSCHAFT ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Steyer, Sascha, TANZMEISTER, GEORG
Publication of US20220414151A1 publication Critical patent/US20220414151A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G06V 20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90: Details of database functions independent of the retrieved data types
    • G06F 16/901: Indexing; Data structures therefor; Storage structures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/25: Fusion techniques
    • G06F 18/251: Fusion techniques of input or preprocessed data
    • G06K 9/6289

Definitions

  • FIG. 1 shows a schematic top view of a motor vehicle having an embodiment of a surroundings sensing device.
  • FIG. 1 identical or functionally identical elements are provided with the same reference signs.
  • FIG. 1 shows a schematic top view of a motor vehicle 10 having an embodiment of a surroundings sensing device 12 .
  • the surroundings sensing device 12 has at least one first surroundings sensor 14 and a second surroundings sensor 18 .
  • the surroundings sensing device 12 has an electronic computing unit 20 .
  • the surroundings sensing device 12 is designed for the motor vehicle 10 to sense surroundings 22 of the motor vehicle 10 .
  • the surroundings 22 are sensed at least using the first surroundings sensor 14 and the second surroundings sensor 18 .
  • the surroundings 22 sensed by way of the first surroundings sensor 14 and the surroundings 22 sensed by way of the second surroundings sensor 18 are transferred to the electronic computing unit 20 .
  • a grid-based evaluation of the transferred, sensed surroundings 22 takes place for a first distance range 26 by way of the electronic computing unit 20
  • a fusion 28 of the transferred detected surroundings 22 takes place for a second distance range 30 , which is different from the first distance range 26 , by way of the electronic computing unit 20 .
  • An evaluation of the surroundings 22 is carried out as a function of the grid-based evaluation 24 and the fusion 28 by way of the electronic computing unit 20 .
  • a first object 16 is located in the first distance range 26 .
  • a second object 36 is located in the second distance range 30 .
  • the objects 16 , 36 can be sensed by way of the surroundings sensing device 12 .
  • the first distance range 26 is provided at a lesser distance A to the motor vehicle 10 than the second distance range 30 .
  • a dynamic grid is generated for the grid-based evaluation 24 .
  • FIG. 1 shows that the first distance range 26 is evaluated decentered relative to the motor vehicle 10 .
  • at least the first distance range 26 can be decentered for this purpose as a function of a determined position of the motor vehicle 10 and/or as a function of a velocity of the motor vehicle 10 .
  • an object list having recognized objects 16 , 36 in the surroundings 22 is generated in each case in the grid-based evaluation 24 and in the fusion 28 and evaluation is carried out in these object lists by way of the electronic computing unit 20 .
  • an association between the object lists of the grid-based evaluation 24 and the object lists of the fusion 28 is carried out in a transition range 32 between the first distance range 26 and the second distance range 30 .
  • a cell size 34 is adapted in the grid-based evaluation 24 as a function of a determined position of the motor vehicle 10 and/or as a function of a velocity of the motor vehicle 10 .
  • the resolution is predefined to be constant in the grid-based evaluation 24 .
  • the embodiment of the invention shown in FIG. 1 thus solves the problem that the grid-based evaluation 24 is very computing intensive.
  • the grid-based evaluation 24 has a high resolution.
  • the grid-based evaluation 24 is carried out in the first distance range 26 and the fusion 28 is carried out in the second distance range 30 .
  • the fusion 28 is in particular a high-level object fusion.
  • objects 16 , 36 can be sensed by way of the surroundings sensing device 12 .
  • the objects 16 , 36 can be both static and also dynamic.
  • It is proposed that the object lists of the high-level object fusion be combined with the object lists from the dynamic grid, including the grid-based object tracking function.
  • the dynamic grid is, with respect to its size and its accuracy in the first distance range 26 , at least as large as the range in which relevant static obstacles, in other words static objects 16 , 36 , are located. This is typically the case with a greater extent to the front and to the side and a significantly lesser distance A to the rear.
  • the resolution corresponds to the desired accuracy of the static objects 16 , 36 .
  • the size of the dynamic grid covers the range in which a high resolution of dynamic objects 16 , 36 is required. The ranges outside the grid are covered by the high-level fusion, in other words the fusion 28 .
  • In the transition range 32 between the dynamic grid and the high-level fusion 28 , there is an association and a combination of the two object lists.
  • Although the dynamic grid is proposed with a fixed resolution and cell size 34 within one time step, the cell size 34 can change over time at equal resolution.
  • small cells at shorter range can be implemented in parking areas, and larger cells can be specified at a longer range on freeways.
  • the position of the motor vehicle 10 in relation to the grid also changes over time and thus the relationship between coverage to the front and to the rear also changes.
  • the motor vehicle 10 can be centered in order to have a coverage to equal distances in all directions, while the perspective to the front is greater on a freeway, for example, and thus decentering takes place.
  • an embodiment of the invention discloses a method for recognizing objects 16 , 36 and obstacles for long ranges and high accuracies.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Traffic Control Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

A method for operating a surroundings sensing device for a motor vehicle includes sensing the surroundings by at least one first surroundings sensor of the surroundings sensing device and by at least one second surroundings sensor of the surroundings sensing device; transmitting the surroundings sensed by the first surroundings sensor and the surroundings sensed by the second surroundings sensor to an electronic computing unit of the surroundings sensing device; performing grid-based evaluation of the surroundings for a first distance range of the surroundings by means of the electronic computing unit; performing fusion of the surroundings for a second distance range, which is different from the first distance range, by the electronic computing unit; and evaluating the surroundings depending on the grid-based evaluation and the fusion.

Description

    BACKGROUND AND SUMMARY OF THE INVENTION
  • The invention relates to a method for operating a surroundings sensing device for a motor vehicle for sensing surroundings of the motor vehicle. Furthermore, the invention relates to a surroundings sensing device.
  • In the field of driver assistance and automated driving functions, static grids are already known for recognizing static obstacles, and high-level object fusion is already known for recognizing and tracking dynamic objects, such as vehicles, trucks, or pedestrians. A plurality of different surroundings sensors is typically installed in the vehicle. In high-level object fusion, object recognition and object tracking are first carried out separately on the sensor data of each individual surroundings sensor, and the resulting tracked object lists are subsequently fused.
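As context, per-sensor tracking followed by list-level fusion can be sketched as follows. The `TrackedObject` fields, the gating distance, and the position-averaging rule are illustrative assumptions, not the method of this or the cited patents; a production system would weight by track covariance and fuse existence probabilities.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    # Hypothetical minimal track: position (x, y) in metres, sensor-local id.
    x: float
    y: float
    track_id: int

def fuse_object_lists(per_sensor_lists, gate=2.0):
    """Greedy sketch of high-level object fusion: merge tracks from different
    sensors whose positions lie within a gating distance, averaging positions."""
    fused = []
    for objs in per_sensor_lists:
        for obj in objs:
            for f in fused:
                if (f.x - obj.x) ** 2 + (f.y - obj.y) ** 2 <= gate ** 2:
                    # Merge by averaging; a real fusion would use covariances.
                    f.x = (f.x + obj.x) / 2
                    f.y = (f.y + obj.y) / 2
                    break
            else:
                fused.append(TrackedObject(obj.x, obj.y, len(fused)))
    return fused

# Two sensors see the same remote vehicle at slightly different positions:
radar = [TrackedObject(60.0, 1.2, 0)]
lidar = [TrackedObject(60.5, 0.8, 0)]
print(len(fuse_object_lists([radar, lidar])))  # 1
```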
  • DE 10 2014 014 295 A1 discloses a method for monitoring a calibration of a plurality of sensor data from surroundings sensors, which record the surroundings of a motor vehicle and are installed at installation positions in the motor vehicle described by extrinsic calibration parameters, with respect to the extrinsic calibration parameters, wherein to ascertain a decalibration of at least one surroundings sensor, the same feature of the surroundings is evaluated in sensor data of different surroundings sensors describing the same properties by at least one decalibration criterion comparing the sensor data.
  • Furthermore, DE 10 2009 006 113 A1 relates to a device and a method for providing a surroundings representation of a vehicle having at least one first sensor unit, at least one second sensor unit, and an evaluation unit, wherein the sensor unit provides items of information about objects recognized in surroundings of the vehicle in the form of sensor objects, wherein a sensor object represents an object recognized by the respective sensor unit, and the sensor objects comprise as an attribute at least one existence probability of the represented object and the sensor objects recognized by the at least one first sensor unit and by the at least one second sensor unit are subjected to an object fusion, in which fusion objects are generated, to which at least one existence probability is assigned as an attribute, wherein the existence probabilities of the fusion objects are fused based on the existence probabilities of the sensor objects, wherein the fusion of the existence probability of one of the sensor objects takes place in each case in dependence on the respective sensor unit by which the corresponding sensor object is provided.
  • The object of the present invention is to provide a method and a surroundings sensing device by way of which the surroundings of the motor vehicle can be sensed in an improved manner.
  • This object is achieved by a method and a surroundings sensing device according to the claimed invention.
  • One aspect of the invention relates to a method for operating a surroundings sensing device for a motor vehicle for sensing surroundings of the motor vehicle. The surroundings are sensed using at least one first surroundings sensor of the surroundings sensing device and using at least one second surroundings sensor of the surroundings sensing device. The surroundings sensed by way of the first surroundings sensor and the surroundings sensed by way of the second surroundings sensor are transferred to an electronic computing unit of the surroundings sensing device. A grid-based evaluation of the transferred, sensed surroundings is carried out for a first distance range of the surroundings by way of the electronic computing unit. A fusion is carried out of the transferred, sensed surroundings for a second distance range different from the first distance range by way of the electronic computing unit. An evaluation of the surroundings is carried out as a function of the grid-based evaluation and the fusion by way of the electronic computing unit.
  • It is thus made possible that the surroundings can be sensed in an improved manner.
  • In particular, an embodiment of the invention thus addresses the problem that the grid-based evaluation is very robust but requires a high computing effort. From a functional aspect, the highest possible range is desired together with high accuracy, but this would also mean the greatest computing effort. It is therefore provided according to an embodiment of the invention that the grid-based evaluation is combined with the fusion, which can also be referred to as high-level object fusion. In particular, the computationally intensive grid-based evaluation is thus carried out in the first distance range, while the high-level object fusion is carried out in the second distance range.
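The division of labor described above can be sketched as a simple range-based split. The 50 m boundary and the placeholder evaluation functions are assumptions for illustration only.

```python
import math

# Hypothetical boundary (metres) between the grid-covered close range
# and the fusion-covered long range.
GRID_RANGE = 50.0

def grid_based_evaluation(points):
    # Placeholder for the computation-heavy, high-resolution grid evaluation.
    return [("grid", p) for p in points]

def high_level_fusion(points):
    # Placeholder for the cheaper long-range high-level object fusion.
    return [("fusion", p) for p in points]

def evaluate_surroundings(detections):
    """Route each detection (x, y) to the grid-based evaluation (first,
    close distance range) or to the fusion (second, remote distance range),
    then evaluate both results together."""
    near = [p for p in detections if math.hypot(*p) <= GRID_RANGE]
    far = [p for p in detections if math.hypot(*p) > GRID_RANGE]
    return grid_based_evaluation(near) + high_level_fusion(far)

print(evaluate_surroundings([(10.0, 0.0), (80.0, 0.0)]))
# [('grid', (10.0, 0.0)), ('fusion', (80.0, 0.0))]
```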
  • In other words, to manage these conflicting demands, it is proposed in an embodiment of this invention that the object lists from the high-level object fusion be combined with the object lists from the dynamic grid, including object tracking. The advantages of both sensing methods can thus be used.
  • In particular, it is possible using the surroundings sensing device to sense and track objects in the surroundings. The objects can be both static objects and also dynamic objects.
  • In the surroundings sensors, it can be provided in particular that an ultrasonic sensor and/or a camera is used, for example, for the first distance range. For the second distance range, for example, a radar sensor and/or a lidar sensor can be used as surroundings sensors.
  • According to one advantageous embodiment, the first distance range is provided at a lesser distance to the motor vehicle than the second distance range. In other words, the grid-based evaluation takes place in the close motor vehicle surroundings, while the fusion is carried out in the more remote motor vehicle surroundings. A high resolution is thus provided for the first distance range, in other words the close range, by which objects can be determined reliably and precisely. In the second distance range, a long range can be achieved by way of the fusion, so that very remote objects can also be recognized and tracked.
  • It is furthermore advantageous if a dynamic grid is generated for the grid-based evaluation. The dynamic grid can also be referred to in particular as a dynamic occupancy grid. This dynamic grid can start from an expansion of a static grid. In particular, grid-based object tracking also takes place in the dynamic grid. In particular, it is thus made possible that objects in the surroundings can be tracked or sensed, which can be both static and also dynamic. In particular, this can advantageously be carried out by way of the dynamic grid in dense surroundings and in the case of atypical objects which have not yet previously been observed. In the grids, the surroundings are in particular divided into cells and a number of attributes is estimated per cell. These grids are very robust, but require a high computing effort. The required number of the corresponding cells of the grid is dependent in particular on two factors. In particular, a first factor is the range or the distance range and a second factor is the desired accuracy. It can be provided in particular here that the smaller a cell is, the more cells are required for the same distance range. In the dynamic grids, in contrast to the static grids, velocities and items of dynamic evidence per cell are also stored.
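A minimal sketch of the per-cell state that distinguishes a dynamic occupancy grid from a static one follows; the cell size, grid extent, and array layout are assumptions chosen for illustration.

```python
import numpy as np

class DynamicGrid:
    """Sketch of a dynamic occupancy grid: per cell, occupancy evidence,
    dynamic evidence, and a 2-D velocity estimate (the extra per-cell state
    that distinguishes a dynamic grid from a static one)."""

    def __init__(self, size_m=100.0, cell_m=0.5):
        n = int(size_m / cell_m)            # smaller cells -> more cells
        self.cell_m = cell_m
        self.occupancy = np.zeros((n, n))    # occupancy evidence per cell
        self.dynamic = np.zeros((n, n))      # evidence that the cell is moving
        self.velocity = np.zeros((n, n, 2))  # estimated (vx, vy) per cell

grid = DynamicGrid(size_m=100.0, cell_m=0.5)
print(grid.occupancy.shape)  # (200, 200)
```

Halving `cell_m` quadruples the number of cells for the same covered area, which is the cost trade-off the text describes.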
  • It is furthermore advantageous if the first distance range is evaluated decentered from the motor vehicle. For example, it can be provided that the first distance range is laid at least essentially in a circular or elliptical manner around the motor vehicle. In certain situations, however, the sensing of a front region of the motor vehicle is more important than the sensing of a rear region of the motor vehicle. It can then be provided that this elliptical distance range is decentered, for example shifted farther in the direction of the front of the motor vehicle, so that the coverage in the decentered region is greater toward the front than in the rear region. The dynamic grid is typically formed with a greater extent to the front and to the side and a significantly lesser extent to the rear. The resolution of the dynamic grid corresponds in particular to the desired accuracy of the static obstacles. Furthermore, the dynamic grid in particular covers the region in which a high resolution of dynamic objects is required.
  • In a further advantageous embodiment, at least the first distance range is decentered as a function of a determined position of the motor vehicle and/or as a function of a velocity of the motor vehicle. If, for example, the motor vehicle is in urban surroundings, it may be provided that the first distance range is shifted only slightly forward, so that the surroundings behind and laterally to the motor vehicle can also be reliably sensed. If, for example, the motor vehicle is traveling on a freeway, the front region in front of the motor vehicle is of greater importance, so that the first distance range is shifted farther forward. Furthermore, the shift of the dynamic grid can also be carried out depending on velocity. A corresponding shift of the grid can thus be carried out in dependence on the velocity and/or position.
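The velocity-dependent decentering can be illustrated by a simple sketch; the maximum offset and the speed at which it saturates are hypothetical parameters, not values from the patent.

```python
def grid_center_offset(speed_mps, max_offset_m=30.0, full_offset_speed=30.0):
    """Shift the grid centre forward with speed: centred at standstill
    (urban driving, equal coverage in all directions), shifted by up to
    max_offset_m toward the front at freeway speed. All parameters are
    assumptions for illustration."""
    forward = min(speed_mps / full_offset_speed, 1.0) * max_offset_m
    return forward  # metres the grid centre is moved ahead of the vehicle

print(grid_center_offset(0.0))   # 0.0  -> centred grid in the city
print(grid_center_offset(36.0))  # 30.0 -> fully decentered on the freeway
```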
  • In particular, the position of the motor vehicle in the grid can also change over time, and thus so can the relationship between coverage to the front and coverage to the rear. In urban surroundings, the motor vehicle will be centered to have coverage at equal range in all directions, while on a freeway, for example, the forward coverage can be greater.
  • Furthermore, it is advantageous if an object list having recognized objects is generated in each case by the grid-based evaluation and by the fusion, and these object lists are evaluated by way of the electronic computing unit. In particular, it is provided that an object list is created individually for each surroundings sensor, and these lists are then fused together by the electronic computing unit. In particular, an object list is also generated on the basis of the dynamic grid. Tracking of the objects can thus be carried out in an improved manner.
  • Furthermore, it has proven to be advantageous if an association between the object lists of the grid-based evaluation and the object lists of the fusion is carried out in a transition range between the first distance range and the second distance range. In particular, it is thus provided that the ranges outside the grid are covered by the fusion, in other words the high-level fusion. In the transition range between the dynamic grid and the high-level fusion, there is an association and a fusion or combination between the two object lists. It is thus made possible that both objects which move from close range into long range can be reliably transferred upon the evaluation and objects which move from long range into close range can also be reliably transferred. Improved operation of the surroundings sensing device is thus implemented.
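The hand-over in the transition range can be sketched as a gated nearest-neighbour matching between the two object lists; the gating distance and the tuple representation of objects are assumptions for illustration.

```python
import math

def associate_in_transition(grid_objects, fusion_objects, gate=3.0):
    """Match objects tracked in the grid against objects from the high-level
    fusion by nearest neighbour within a gating distance, as a sketch of the
    association in the transition range between close range and long range."""
    pairs = []
    unmatched_fusion = list(fusion_objects)
    for g in grid_objects:
        best, best_d = None, gate
        for f in unmatched_fusion:
            d = math.hypot(g[0] - f[0], g[1] - f[1])
            if d < best_d:
                best, best_d = f, d
        if best is not None:
            unmatched_fusion.remove(best)  # each fusion object matched once
            pairs.append((g, best))
    return pairs

# An object leaving the close range is picked up by the long-range fusion:
pairs = associate_in_transition([(49.0, 0.0)], [(49.8, 0.3)])
print(len(pairs))  # 1
```

Matched pairs can then be combined into one track, so an object crossing the boundary in either direction keeps a single identity.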
  • It is also advantageous if a cell size is adapted in the grid-based evaluation as a function of a determined position of the motor vehicle and/or as a function of a velocity of the motor vehicle. In particular, it is proposed that within one time step the resolution and the cell size of the grid remain the same. However, the cell size can change over time at equal resolution. For example, small cells can be used at shorter range in parking areas and larger cells at longer range, for example on a freeway. The surroundings can thus be detected in an improved manner.
  • Furthermore, it can be provided that a resolution is predefined to be constant in the grid-based evaluation. In other words, the resolution remains the same, but the cell size can change. Implementing improved object recognition in the surroundings of the motor vehicle is thus enabled in a simple manner.
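  • The two items above can be read as keeping the cell count of the grid (the "resolution" in this sense) constant while choosing the cell size from the driving context, so that the covered extent grows with speed. The following sketch illustrates that reading; the speed breakpoints, the cell sizes, and the default cell count are illustrative assumptions.

```python
def cell_size_for(speed_mps, n_cells=512):
    """Pick a grid cell size from the driving context while keeping the
    number of cells constant, so the covered extent scales with the
    cell size.  Returns (cell_size_m, covered_extent_m)."""
    if speed_mps < 5.0:      # parking / maneuvering: fine cells, short range
        size = 0.1
    elif speed_mps < 20.0:   # urban driving
        size = 0.2
    else:                    # freeway: coarse cells, long range
        size = 0.5
    return size, n_cells * size
```

With 512 cells, the sketch covers roughly 51 m with 10 cm cells while parking and 256 m with 50 cm cells on a freeway, without changing the per-time-step memory footprint of the grid.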
  • A further aspect of the invention relates to a surroundings sensing device for a motor vehicle for sensing surroundings having at least two surroundings sensors and having at least one electronic computing unit, wherein the surroundings sensing device is designed to carry out a method according to the preceding aspect. In particular, the method is carried out by way of the surroundings sensing device.
  • Still a further aspect of the invention relates to a motor vehicle having a surroundings sensing device. The motor vehicle is designed in particular as a passenger vehicle.
  • Advantageous embodiments of the method are to be regarded as advantageous embodiments of the surroundings sensing device and of the motor vehicle. For this purpose, the surroundings sensing device and the motor vehicle have corresponding features which enable the method and advantageous embodiments thereof to be carried out.
  • Further features of the invention result from the claims, the figure, and the description of the figure. The features and combinations of features mentioned above in the description and the features and combinations of features mentioned hereinafter in the description of the figure and/or solely shown in the figure are usable not only in the respective specified combination but also in other combinations or alone.
  • The invention will now be explained in more detail on the basis of a preferred exemplary embodiment and with reference to the drawing.
  • BRIEF DESCRIPTION OF THE DRAWING
  • FIG. 1 shows a schematic top view of a motor vehicle having an embodiment of a surroundings sensing device.
  • DETAILED DESCRIPTION OF THE DRAWING
  • In FIG. 1, identical or functionally identical elements are provided with the same reference signs.
  • FIG. 1 shows a schematic top view of a motor vehicle 10 having an embodiment of a surroundings sensing device 12. The surroundings sensing device 12 has at least one first surroundings sensor 14 and a second surroundings sensor 18. Furthermore, the surroundings sensing device 12 has an electronic computing unit 20. The surroundings sensing device 12 is designed for the motor vehicle 10 to sense surroundings 22 of the motor vehicle 10.
  • In the method for operating the surroundings sensing device 12 for the motor vehicle 10 to sense the surroundings 22 of the motor vehicle 10, the surroundings 22 are sensed at least using the first surroundings sensor 14 and the second surroundings sensor 18. The surroundings 22 sensed by way of the first surroundings sensor 14 and the surroundings 22 sensed by way of the second surroundings sensor 18 are transferred to the electronic computing unit 20. A grid-based evaluation 24 of the transferred sensed surroundings 22 takes place for a first distance range 26 by way of the electronic computing unit 20, and a fusion 28 of the transferred sensed surroundings 22 takes place for a second distance range 30, which is different from the first distance range 26, by way of the electronic computing unit 20. An evaluation of the surroundings 22 is carried out as a function of the grid-based evaluation 24 and the fusion 28 by way of the electronic computing unit 20.
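  • The split of the sensed surroundings into the two distance ranges can be sketched as a simple dispatch step; the 50 m boundary, the tuple layout `(sensor_id, x, y)`, and the function name are illustrative assumptions, since the disclosure does not fix concrete range values.

```python
import math

def split_by_range(detections, near_range_m=50.0):
    """Route each detection either to the grid-based evaluation (first,
    near distance range) or to the high-level fusion (second, far
    distance range).  Detections are (sensor_id, x, y) tuples in
    vehicle coordinates."""
    grid_input, fusion_input = [], []
    for sensor_id, x, y in detections:
        if math.hypot(x, y) <= near_range_m:
            grid_input.append((sensor_id, x, y))    # high-resolution grid
        else:
            fusion_input.append((sensor_id, x, y))  # object-level fusion
    return grid_input, fusion_input
```

The grid path then consumes `grid_input` at cell level, while `fusion_input` is reduced to object lists per sensor before fusion, as described above.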
  • In particular, a first object 16 is located in the first distance range 26. In particular, a second object 36 is located in the second distance range 30. In particular, the objects 16, 36 can be sensed by way of the surroundings sensing device 12.
  • In particular, it is provided that the first distance range 26 is provided at a lesser distance A to the motor vehicle 10 than the second distance range 30.
  • In particular, a dynamic grid is generated for the grid-based evaluation 24.
  • Furthermore, FIG. 1 shows that the first distance range 26 is evaluated decentered in relation to the motor vehicle 10. In particular, at least the first distance range 26 can be decentered for this purpose as a function of a determined position of the motor vehicle 10 and/or as a function of a velocity of the motor vehicle 10.
  • Furthermore, it is provided in particular that an object list having recognized objects 16, 36 in the surroundings 22 is generated in each case in the grid-based evaluation 24 and in the fusion 28, and these object lists are evaluated by way of the electronic computing unit 20.
  • Furthermore, it can be provided in particular that an association between the object lists of the grid-based evaluation 24 and the object lists of the fusion 28 is carried out in a transition range 32 between the first distance range 26 and the second distance range 30.
  • In particular, it can be provided that a cell size 34 is adapted in the grid-based evaluation 24 as a function of a determined position of the motor vehicle 10 and/or as a function of a velocity of the motor vehicle 10. In particular, it is also provided that the resolution is predefined to be constant in the grid-based evaluation 24.
  • In particular, the embodiment of the invention shown in FIG. 1 thus solves the problem that the grid-based evaluation 24 is very computing intensive. In particular, however, the grid-based evaluation 24 has a high resolution. To save computing capacity, the grid-based evaluation 24 is carried out in the first distance range 26 and the fusion 28 is carried out in the second distance range 30. The fusion 28 is in particular a high-level object fusion.
  • In particular, it is thus provided that objects 16, 36 can be sensed by way of the surroundings sensing device 12. The objects 16, 36 can be both static and also dynamic.
  • In other words, to manage the conflicting requirements, it is proposed that the object lists of the high-level object fusion be combined with the object lists from the dynamic grid, including a grid-based object tracking function. With respect to size and accuracy, the dynamic grid at least covers, in the first distance range 26, the range in which relevant static obstacles, in other words static objects 16, 36, are located. This is typically achieved with greater coverage to the front and to the side and with significantly lesser distance A to the rear. The resolution corresponds to the desired accuracy of the static objects 16, 36. In addition, the size of the dynamic grid covers the range in which a high resolution of dynamic objects 16, 36 is required. The ranges outside the grid are covered by the high-level fusion, in other words the fusion 28. In the transition range 32 between the dynamic grid and the high-level fusion, there is an association and a combination of the two object lists. Although a fixed resolution and cell size 34 within one time step are proposed for the dynamic grid, the cell size 34 can certainly change over time at equal resolution. For example, small cells at shorter range can be used in parking areas and larger cells can be specified at longer range on freeways. Furthermore, it can be provided that the position of the motor vehicle 10 in relation to the grid also changes over time, and thus the ratio of coverage to the front and to the rear also changes. In urban surroundings 22, the motor vehicle 10 can be centered in order to have coverage at equal distances in all directions, while on a freeway, for example, the forward coverage is greater and decentering thus takes place.
  • Overall, an embodiment of the invention discloses a method for recognizing objects 16, 36 and obstacles at long ranges and with high accuracy.
  • LIST OF REFERENCE SIGNS
    • 10 motor vehicle
    • 12 surroundings sensing device
    • 14 first surroundings sensor
    • 16 first object
    • 18 second surroundings sensor
    • 20 electronic computing unit
    • 22 surroundings
    • 24 grid-based evaluation
    • 26 first distance range
    • 28 fusion
    • 30 second distance range
    • 32 transition range
    • 34 cell size
    • 36 object
    • A distance

Claims (11)

1.-10. (canceled)
11. A method for operating a surroundings sensing device for a motor vehicle for sensing surroundings of the motor vehicle, the method comprising:
sensing the surroundings using at least one first surroundings sensor of the surroundings sensing device and using at least one second surroundings sensor of the surroundings sensing device;
transferring the surroundings sensed by the first surroundings sensor and the surroundings sensed by the second surroundings sensor to an electronic computing unit of the surroundings sensing device;
performing grid-based evaluation of the surroundings for a first distance range of the surroundings by the electronic computing unit;
performing fusion of the surroundings for a second distance range different from the first distance range by the electronic computing unit; and
evaluating the surroundings as a function of the grid-based evaluation and the fusion by the electronic computing unit.
12. The method according to claim 11, wherein:
the first distance range is provided at a lesser distance to the motor vehicle than the second distance range.
13. The method according to claim 11, wherein:
a dynamic grid is generated for the grid-based evaluation.
14. The method according to claim 11, wherein:
the first distance range is evaluated decentered in relation to the motor vehicle.
15. The method according to claim 14, wherein:
at least the first distance range is decentered as a function of at least one of a determined position of the motor vehicle or a velocity of the motor vehicle.
16. The method according to claim 11, wherein:
an object list having recognized objects in the surroundings is generated in each of the grid-based evaluation and in the fusion, and the object lists are evaluated by the electronic computing unit.
17. The method according to claim 16, wherein:
an association between the object lists of the grid-based evaluation and the object lists of the fusion is carried out in a transition range between the first distance range and the second distance range.
18. The method according to claim 11, wherein:
a cell size is adapted in the grid-based evaluation as a function of at least one of a determined position of the motor vehicle or a velocity of the motor vehicle.
19. The method according to claim 11, wherein:
a resolution is specified to be constant in the grid-based evaluation.
20. A surroundings sensing device for a motor vehicle for sensing surroundings, the surroundings sensing device comprising:
at least two surroundings sensors;
at least one electronic computing unit; and
a processor configured to carry out a method comprising:
sensing the surroundings using at least one first surroundings sensor of the surroundings sensing device and using at least one second surroundings sensor of the surroundings sensing device;
transferring the surroundings sensed by the first surroundings sensor and the surroundings sensed by the second surroundings sensor to an electronic computing unit of the surroundings sensing device;
performing grid-based evaluation of the surroundings for a first distance range of the surroundings by the electronic computing unit;
performing fusion of the surroundings for a second distance range different from the first distance range by the electronic computing unit; and
evaluating the surroundings as a function of the grid-based evaluation and the fusion by the electronic computing unit.
US17/780,272 2019-11-28 2020-11-03 Method for Operating a Surroundings Sensing Device with Grid-Based Evaluation and with Fusion, and Surroundings Sensing Device Pending US20220414151A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE102019132363.0 2019-11-28
DE102019132363.0A DE102019132363A1 (en) 2019-11-28 2019-11-28 Method for operating an environment detection device with a grid-based evaluation and with a merger, as well as environment detection device
PCT/EP2020/080782 WO2021104805A1 (en) 2019-11-28 2020-11-03 Method for operating a surroundings sensing device with grid-based evaluation and with fusion, and surroundings sensing device

Publications (1)

Publication Number Publication Date
US20220414151A1 true US20220414151A1 (en) 2022-12-29

Family

ID=73059914


Country Status (4)

Country Link
US (1) US20220414151A1 (en)
CN (1) CN114730495A (en)
DE (1) DE102019132363A1 (en)
WO (1) WO2021104805A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220185294A1 (en) * 2020-11-20 2022-06-16 Commissariat A L'energie Atomique Et Aux Energies Alternatives Iterative method for estimating the movement of a material body by generating a filtered movement grid

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100217439A1 (en) * 2009-02-23 2010-08-26 Samsung Electronics Co., Ltd. Map building apparatus and method
US20150266475A1 (en) * 2014-03-20 2015-09-24 Bayerische Motoren Werke Aktiengesellschaft Method and Device for Establishing a Trajectory for a Vehicle
US20160137207A1 (en) * 2013-07-26 2016-05-19 Bayerische Motoren Werke Aktiengesellschaft Method and Apparatus For Efficiently Providing Occupancy Information on the Surroundings of a Vehicle
US20160171893A1 (en) * 2014-12-16 2016-06-16 Here Global B.V. Learning Lanes From Radar Data
US20160179094A1 (en) * 2014-12-17 2016-06-23 Bayerische Motoren Werke Aktiengesellschaft Communication Between a Vehicle and a Road User in the Surroundings of a Vehicle
WO2017097786A2 (en) * 2015-12-07 2017-06-15 Valeo Schalter Und Sensoren Gmbh Driver-assistance method and device
US20170247036A1 (en) * 2016-02-29 2017-08-31 Faraday&Future Inc. Vehicle sensing grid having dynamic sensing cell size
US20170334353A1 (en) * 2015-02-09 2017-11-23 Applications Solutions (Electronic and Vision) Ltd Parking Assistance System
US20180232947A1 (en) * 2017-02-11 2018-08-16 Vayavision, Ltd. Method and system for generating multidimensional maps of a scene using a plurality of sensors of various types
US20180300561A1 (en) * 2017-04-13 2018-10-18 Bayerische Motoren Werke Aktiengesellschaft Method for Detecting and/or Tracking Objects
US20190294174A1 (en) * 2018-03-20 2019-09-26 Honda Motor Co., Ltd. Vehicle control system, vehicle control method, and storage medium
US20200064483A1 (en) * 2017-04-28 2020-02-27 SZ DJI Technology Co., Ltd. Sensing assembly for autonomous driving
US20200324795A1 (en) * 2019-04-12 2020-10-15 Nvidia Corporation Neural network training using ground truth data augmented with map information for autonomous machine applications
US20200356582A1 (en) * 2019-05-09 2020-11-12 Ankobot (Shenzhen) Smart Technologies Co., Ltd. Method for updating a map and mobile robot
US20210131823A1 (en) * 2018-06-22 2021-05-06 Marelli Europe S.P.A. Method for Vehicle Environment Mapping, Corresponding System, Vehicle and Computer Program Product
US20220091252A1 (en) * 2019-06-06 2022-03-24 Huawei Technologies Co., Ltd. Motion state determining method and apparatus

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102009006113B4 (en) * 2008-03-03 2019-03-28 Volkswagen Ag Device and method for sensor fusion with dynamic objects
DE102010023199A1 (en) * 2010-06-09 2011-02-10 Daimler Ag Method for operating image capturing device for realizing outer mirror functions of motor vehicle, involves monitoring momentary state of cameras in dependence of result of comparison and/or merging of environmental characteristics
DE102014014295A1 (en) * 2014-09-25 2016-03-31 Audi Ag Method for monitoring a calibration of a plurality of environmental sensors of a motor vehicle and motor vehicle
DE102015010535A1 (en) * 2015-08-12 2016-02-18 Daimler Ag Camera-based environmental detection for commercial vehicles
DE102015016057A1 (en) * 2015-12-11 2016-06-23 Daimler Ag Sensor arrangement and vehicle
DE102016212734A1 (en) * 2016-07-13 2018-01-18 Conti Temic Microelectronic Gmbh Control device and method



Also Published As

Publication number Publication date
WO2021104805A1 (en) 2021-06-03
CN114730495A (en) 2022-07-08
DE102019132363A1 (en) 2021-06-02


Legal Events

Date Code Title Description
AS Assignment

Owner name: BAYERISCHE MOTOREN WERKE AKTIENGESELLSCHAFT, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:STEYER, SASCHA;TANZMEISTER, GEORG;SIGNING DATES FROM 20201109 TO 20201208;REEL/FRAME:060051/0996

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED