DE102009006113B4 - Device and method for sensor fusion with dynamic objects - Google Patents

Device and method for sensor fusion with dynamic objects

Info

Publication number
DE102009006113B4
Authority
DE
Germany
Prior art keywords
sensor
objects
fusion
object
existence
Prior art date
Legal status
Active
Application number
DE102009006113.4A
Other languages
German (de)
Other versions
DE102009006113A1 (en)
Inventor
Thien-Nghia Nguyen
Dr. Marc Michael Meinecke
Dr. Thanh-Binh To
Prof. Dr. Bernd Michaelis
Current Assignee
Volkswagen AG
Original Assignee
Volkswagen AG
Priority date
Filing date
Publication date
Priority claimed from DE102008012197.5
Application filed by Volkswagen AG
Priority to DE102009006113.4A
Publication of DE102009006113A1
Application granted
Publication of DE102009006113B4
Legal status: Active
Anticipated expiration


Classifications

    • G01S 13/726: Radar-tracking systems for two-dimensional tracking by using numerical data; multiple target tracking
    • G01S 13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G06K 9/00805: Recognising scenes perceived from the perspective of a land vehicle; detecting potential obstacles
    • G06K 9/6288: Fusion techniques, i.e. combining data from various sources, e.g. sensor fusion
    • G08G 1/165: Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G08G 1/166: Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • B60W 2050/0013: Details of the control system; optimal controllers
    • B60W 2420/42: Indexing codes relating to the type of sensors; image sensing, e.g. optical camera
    • B60W 2420/52: Indexing codes relating to the type of sensors; radar, lidar

Abstract

A method for providing an environment representation of a vehicle having at least one first sensor device (2), at least one second sensor device (3) and an evaluation device (4). The sensor devices (2, 3) provide information about objects (10, 14) detected in an environment of the vehicle (6) in the form of sensor objects, wherein a sensor object represents an object (10, 14) recognized by the respective sensor device (2, 3) and the sensor objects comprise as an attribute at least one existence probability of the represented object (10, 14). The sensor objects detected by the at least one first sensor device (2) and by the at least one second sensor device (3) are subjected to an object fusion (66) in which fusion objects are generated, to which at least one existence probability is assigned as an attribute.
The method is characterized in that the existence probabilities of the fusion objects are fused on the basis of the existence probabilities of the sensor objects, wherein the fusion of the existence probability of each sensor object takes place depending on the respective sensor device (2, 3) from which the corresponding sensor object is provided. For the fusion objects determined at time k, a prediction is made for a time k+1, and the sensor objects determined at time k+1 are associated with the fusion objects whose prediction they match according to association specifications; the existence probability of the fusion object at time k+1 is fused on the basis of the existence probability of the fusion object at time k and the existence probability of the sensor object at time k+1. The existence probability of a fusion object that can be associated with a sensor object is calculated according to the following formula:
    P_FusObj(k+1) = [P_FusObj(k)^{δ_A} · P_SenObj(k+1)^{δ_B}]^{1−γ} · [1 − (1 − P_FusObj(k)^{δ_A}) · (1 − P_SenObj(k+1)^{δ_B})]^{γ},
where P_FusObj(k) is the existence probability of the fusion object at time k, P_SenObj(k) the existence probability of the sensor object at time k, γ a sensor-independent fusion constant, and δ_A and δ_B sensor-device-dependent fusion constants, where δ_A + δ_B = 1.

Description

  • The invention relates to a method and a device for describing the environment of a vehicle, which is detected by at least one first sensor device and at least one second sensor device and whose information is evaluated in an evaluation device in order to provide an environment representation. In particular, the invention relates to a method for providing an environment representation of a vehicle having at least one first sensor device, at least one second sensor device and an evaluation device. Furthermore, the invention relates to a device for providing an environment representation of a vehicle, comprising at least one first sensor device and at least one second sensor device, which detect the environment, and an evaluation device which is coupled to the at least one first sensor device and to the at least one second sensor device.
  • Knowledge of the environment of a vehicle is of great importance for various vehicle systems, such as safety systems. Sensor devices that include, for example, an ultrasonic sensor or a camera acquire data about the environment. By using information from different sensor devices, a representation of the environment is generated which should reflect the real environment as accurately as possible. The environment contains static objects, such as trees and houses, as well as dynamic objects that move within a fixed frame of reference, such as people, animals and automobiles. To provide sufficient information about these moving (dynamic) objects in the environment of the vehicle at any time, continuous monitoring, i.e. continuous acquisition of measurement data about the environment, is necessary. As a rule, the sensor devices therefore operate in continuously repeated measuring cycles.
  • DE 10 2006 035 207 A1 discloses a device which uses a camera device to determine an approximate value for the position of a possible object. In an evaluation device, two limit values for the position of the object are determined on the basis of the data provided by the camera. A laser imaging detection and ranging (lidar) system or a radar system is then used to check whether an object is actually located in the position range defined by the first and second limit values.
  • DE 10 2005 009 146 A1 discloses a driver assistance system for motor vehicles, with a sensor device for acquiring measurement data about the environment of the vehicle, at least two assistance functions and a data processing device that evaluates the measurement data and, for each assistance function, generates at least one specific environment hypothesis whose output data are prepared in a form suitable for the assistance function in question. In the data processing device, at least one environment hypothesis is predefined which has a structure subdivided into several sub-hypotheses standing in a logical relationship, such that output data of one sub-hypothesis flow into the generation of another sub-hypothesis and at least two assistance functions directly or indirectly access a common sub-hypothesis.
  • DE 695 22 486 T2 describes a method for tracking moving bodies within a space monitored by multiple sensors. EP 0 913 751 A1 furthermore discloses an autonomous vehicle and a method for controlling an autonomous vehicle.
  • It is an object of the invention to provide a method and a device which make it possible to provide an improved representation of a vehicle environment containing static and dynamic objects.
  • The object is achieved by a method having the features of patent claim 1 and a device having the features of claim 10. Advantageous embodiments of the invention will become apparent from the dependent claims.
  • In particular, a method is provided for providing an environment representation of a vehicle having at least one first sensor device, at least one second sensor device and an evaluation device, in which the sensor devices provide information about objects detected in the environment of the vehicle in the form of sensor objects. A sensor object represents an object recognized by the respective sensor device and comprises as an attribute at least one existence probability of the represented object. The sensor objects detected by the at least one first sensor device and by the at least one second sensor device are subjected to an object fusion in which fusion objects are generated, each of which is assigned at least one existence probability as an attribute; the existence probabilities of the fusion objects are fused on the basis of the existence probabilities of the sensor objects, the fusion of the existence probability of a sensor object depending in each case on the sensor device from which the corresponding sensor object is provided. A corresponding device for providing an environment representation of a vehicle comprises at least one first sensor device and at least one second sensor device, which detect the environment, and an evaluation device which is coupled to the at least one first sensor device and the at least one second sensor device. The sensor devices are designed to provide information about the objects they detect in the form of sensor objects, which comprise as an attribute at least one existence probability for the detected object, and the evaluation device is designed to subject the sensor objects detected by the at least one first sensor device and by the at least one second sensor device to an object fusion in which fusion objects are generated, each of which is assigned at least one existence probability as an attribute; the fusion of the existence probability of a fusion object is carried out on the basis of the existence probabilities of the sensor objects and depends on the respective sensor device from which the respective sensor object is provided. Although the individual sensor devices already recognize objects independently and provide these detected objects as sensor objects, generally in the form of a list of sensor objects, it is provided according to the invention to subject the sensor objects provided by the different sensor devices to a fusion in which fusion objects are generated. Since the existence probability of the recognized objects is of particular importance, the fusion objects comprise at least one existence probability as an attribute. By fusing the sensor objects, each of which is detected by one of the sensor devices, fusion objects are created that represent the objects actually present, and the existence probabilities of the fusion objects (as well as their other attributes) can be significantly improved.
Since the individual sensor devices differ with regard to their measurement quality and/or susceptibility to errors, for example due to noise or sensitivity to changing environmental conditions, it is provided that the fusion of the existence probability of an individual sensor object into the existence probability of a fusion object takes account of the sensor device that provided the respective sensor object. By merging the sensor objects into fusion objects, a significantly improved environment representation is achieved. In particular for safety-relevant vehicle systems, which depend on the existence of objects in the surroundings of the motor vehicle, good knowledge about the actual existence of objects in the environment is necessary. By fusing the existence probabilities of the sensor objects into an existence probability of fusion objects that represent the real objects, the statement about the existence probability of objects in the environment of the motor vehicle is thus significantly improved. This can significantly reduce or completely avoid false triggering of safety systems which, for example, activate restraint systems before the actual occurrence of a collision with an object in the environment that is recognized as unavoidable.
  • In an embodiment according to the invention, it is provided that a prediction for a time k+1 is made for the fusion objects determined at time k, and the sensor objects determined at time k+1 are associated with those fusion objects whose prediction they match in accordance with association specifications; the existence probability of the fusion object at time k+1 is then fused on the basis of the existence probability of the fusion object at time k and the existence probability of the sensor object at time k+1. The association specification may, for example, be a Mahalanobis distance threshold. It is thus possible to associate the fusion objects generated in previous fusion steps with the currently provided sensor objects and to make optimal use of the previously obtained information, in particular with regard to the existence probability of the objects present in the surroundings.
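  • Purely as an illustration of the association step just described, the following Python sketch shows a Mahalanobis distance gate that decides whether a sensor object may be associated with a predicted fusion object. The function name, the gate threshold and the covariance values are assumptions chosen for the example and are not taken from the patent.

    import numpy as np

    def mahalanobis_gate(predicted_pos, predicted_cov, measured_pos, measured_cov, gate=3.0):
        # Residual between the predicted fusion object and the measured sensor object
        innovation = measured_pos - predicted_pos
        # Combined uncertainty of prediction and measurement
        S = predicted_cov + measured_cov
        d2 = innovation @ np.linalg.inv(S) @ innovation
        return np.sqrt(d2) <= gate

    # Example: predicted fusion object at (10.0 m, 2.0 m), sensor object at (10.4 m, 1.8 m)
    pred, meas = np.array([10.0, 2.0]), np.array([10.4, 1.8])
    P, R = np.diag([0.25, 0.25]), np.diag([0.10, 0.10])
    print(mahalanobis_gate(pred, P, meas, R))  # True: the sensor object passes the gate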
  • According to the invention, the existence probability of a fusion object that can be associated with a sensor object is calculated according to the following formula:
    P_FusObj(k+1) = [P_FusObj(k)^{δ_A} · P_SenObj(k+1)^{δ_B}]^{1−γ} · [1 − (1 − P_FusObj(k)^{δ_A}) · (1 − P_SenObj(k+1)^{δ_B})]^{γ},
    where P_FusObj(k) denotes the existence probability of the fusion object at time k, P_SenObj(k) the existence probability of the sensor object at time k, γ a sensor-independent fusion constant, and δ_A and δ_B sensor-device-dependent fusion constants with δ_A + δ_B = 1. This formula, which is borrowed from fuzzy logic, allows a preselection via the fixed choice of the constant γ of whether the fusion should correspond more to an AND combination or more to an OR combination of the two existence probabilities, i.e. the existence probability of the fusion object at time k and the existence probability of the sensor object at time k+1. In addition, the sensor-device-dependent fusion constants δ_A and δ_B make it possible to weight the existence probabilities provided by different sensor devices differently relative to the existence probability already fused into the fusion object. This takes into account the fact that the existence probability statements of the individual sensor devices differ in reliability and quality. Furthermore, depending on the quality of the association, the "old" fusion object existence probability P_FusObj(k) can be weighted relative to the existence probability P_SenObj(k+1) of the sensor object.
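  • The following Python sketch illustrates, under the assumption of freely chosen example values, how the weighted gamma operator given above combines the existence probability of a fusion object with that of a newly associated sensor object; the function name is illustrative only.

    def fuse_existence(p_fus_k, p_sen_k1, delta_a, delta_b, gamma):
        # AND-like part: weighted product of the two existence probabilities
        p_and = (p_fus_k ** delta_a) * (p_sen_k1 ** delta_b)
        # OR-like part: complement of the product of the weighted complements
        p_or = 1.0 - (1.0 - p_fus_k ** delta_a) * (1.0 - p_sen_k1 ** delta_b)
        # gamma close to 0 favours the AND combination, gamma close to 1 the OR combination
        return (p_and ** (1.0 - gamma)) * (p_or ** gamma)

    # Example: a reliable track (0.8) confirmed by a sensor object (0.6), sensor weighted lower
    print(fuse_existence(0.8, 0.6, delta_a=0.6, delta_b=0.4, gamma=0.3))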
  • In order to be able to take account of new objects appearing in the environment during the fusion, it is provided in one embodiment of the invention that, for sensor objects that cannot be associated with any of the known fusion objects, the fusion comprises generating a new fusion object to which the existence probability of the sensor object is assigned.
  • While an environment representation made up of sensor objects and fusion objects fused from them is particularly well suited for moving objects, i.e. objects moving within a fixed frame of reference in the environment of the vehicle, an improved environment representation can be achieved if the sensor devices additionally provide information on the basis of which an occupancy map is created: the environment is divided into areas, each area is assigned a cell, and the cells together represent the occupancy map; occupancy probabilities are determined for the cells, indicating a measure of the existence of an object in the region of the environment associated with the cell. In addition to the object recognition in the sensor devices, which leads to the derivation of the sensor objects, information is thus provided that makes it possible to create an occupancy map of the environment. The occupancy map is usually created by assigning to each cell at least one value that indicates the probability of the existence of an object in the region of the environment associated with the cell. An advantage of such a method for providing an environment representation is that the information from the at least two sensor devices is used as completely as possible: on the one hand, good occupancy probability statements, which are particularly meaningful for static objects, are generated in the occupancy map formed by the cells, and on the other hand high existence probabilities can also be obtained for dynamic objects via the fusion objects.
  • A clear further improvement can be achieved if the results of the object-oriented environment representation are combined with the map-based environment representation. In a preferred embodiment, it is therefore provided that, for fusion objects which cannot be assigned to any of the sensor objects at a measurement time k+1, the existence probability is fused by using, instead of the existence probability of a corresponding sensor object, the occupancy probability for the area or areas at which the fusion object is located according to the prediction. Such situations, in which no corresponding sensor object for a fusion object is supplied by a measuring device, occur, for example, with a radar system if another static or dynamic object in the environment temporarily conceals the real object corresponding to the fusion object. This means that the object in the environment corresponding to the fusion object cannot be detected by the radar beams because another object, for example a bridge pier or a road sign mounted on a traffic island, obscures the radar's view of the corresponding real object for one or more measuring cycles. In such a situation, a sensor device is often unable to provide a sensor object for this real object, since it must itself be able to detect and track the real object over a certain period of time to ensure that a really existing object is present. By contrast, the raw data can often provide an earlier and more reliable statement about the occupancy of a certain area in the environment of the motor vehicle. The fusion is thus improved by using known occupancy probabilities from the occupancy map for the area into which the fusion object has moved according to the prediction.
  • In one embodiment, it is provided that this fusion takes place according to the following formula:
    P_FusObj(k+1) = [P_FusObj(k) · P_Grid(k+1)]^{1−γ} · [1 − (1 − P_FusObj(k)) · (1 − P_Grid(k+1))]^{γ},
    where P_Grid(k+1) indicates the occupancy probability derived from the occupancy map at the predicted location of the fusion object. Here, γ again specifies a fusion constant which determines whether the fusion corresponds more to an AND combination or to an OR combination of the existence or occupancy probabilities.
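  • A minimal sketch of this fallback, assuming illustrative values: when no sensor object can be associated, the occupancy probability of the predicted cell takes the place of the sensor existence probability.

    def fuse_with_grid(p_fus_k, p_grid_k1, gamma):
        # Gamma-operator fusion of track existence and grid occupancy probability
        p_and = p_fus_k * p_grid_k1
        p_or = 1.0 - (1.0 - p_fus_k) * (1.0 - p_grid_k1)
        return (p_and ** (1.0 - gamma)) * (p_or ** gamma)

    # Fusion object temporarily hidden behind a bridge pier, but the grid cell still looks occupied
    print(fuse_with_grid(0.7, 0.55, gamma=0.5))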
  • Furthermore, it is advantageous to carry out a classification of the objects: a classification of the sensor objects with regard to their type and preferably also a classification of the fusion objects with regard to their type. A classification refers to an object type or an object class; these include pedestrians, cyclists, cars, trees, houses, etc. Particularly advantageously, the classification is made using comparison data which are present in the sensor devices and/or stored in the evaluation device. Reference data, for example, can serve as comparison data. For a sensor device designed as a stereo camera, for example, the ratio of height to width of a detected object within a defined spatial region of the environment can be determined; on the basis of this ratio it can be decided, for example, whether the object is a pedestrian. The classification based on reference data can be carried out in the sensor devices and/or in the evaluation device. The classification thus represents a possible attribute of the sensor objects and also of the fusion objects. As further attributes, specifications about the geometry of the objects, such as a length, a width and/or a height, are determined. It is also advantageous to determine the dynamics of the objects. The dynamics of an object include a yaw rate, a velocity, an orientation and an acceleration, etc. They also include dynamics that characterize vehicle-related driving behaviour, such as an overtaking manoeuvre, pulling into or out of a parking space, emergency braking, acceleration, deceleration, or a pedestrian and/or cyclist crossing a lane, to name a few properties characterizing the dynamics of objects.
  • Like the existence probabilities of the individual sensor objects, other attributes can also be subjected to a fusion. A preferred embodiment therefore provides that the sensor objects and fusion objects are subjected to a fusion process with regard to further attributes, for example using a Kalman filter. This means that the dynamics and the prediction can be handled, for example, by means of a Kalman filter.
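  • As an illustration of such an attribute fusion, the following sketch runs one predict/update cycle of a simple constant-velocity Kalman filter for the position and velocity of a fusion object. The motion model, the measurement model and the noise levels are assumptions for the example and not values prescribed by the patent.

    import numpy as np

    def kalman_step(x, P, z, dt, q=0.5, r=0.3):
        F = np.array([[1.0, dt], [0.0, 1.0]])          # constant-velocity motion model
        H = np.array([[1.0, 0.0]])                     # only the position is measured
        Q = q * np.array([[dt**4 / 4, dt**3 / 2], [dt**3 / 2, dt**2]])
        R = np.array([[r]])
        x = F @ x                                      # prediction of the state
        P = F @ P @ F.T + Q
        y = z - H @ x                                  # innovation from the sensor object
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        return x, P

    x, P = np.array([0.0, 10.0]), np.eye(2)            # initial position and velocity
    x, P = kalman_step(x, P, z=np.array([1.1]), dt=0.1)
    print(x)                                           # fused position/velocity estimate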
  • For a further embodiment of the invention, it is advantageous for the information processed in the evaluation device to be combined into a detailed description of the environment. The evaluation device can be a computer that receives information, processes it and combines the results. Advantageously, the combined results can also be processed graphically by the evaluation device so that they can be displayed on an output unit, e.g. a display device. The detailed results are advantageously used by other systems that rely on environment information, for example to steer the vehicle, to intervene in the longitudinal and/or transverse guidance and/or the driving dynamics, to take pre-crash measures, etc. It is therefore advantageous if the environment representation is provided at least in electronic form.
  • Information of an expert system can be stored or implemented in the evaluation device. The expert knowledge, which is partly contained in the fusion rules and formulas, can additionally evaluate and utilize further information in order to improve the information fusion. In one embodiment of the invention, information about the surroundings and/or the at least one first sensor device and/or the at least one second sensor device is acquired by means of at least one third sensor device, and the sensor-device dependence of the fusion is varied depending on the information of the at least one third sensor device. The third sensor device may, for example, be a sensor that detects rain in the environment of the motor vehicle. If one of the sensor devices is, for example, a stereo camera, it is known that its object detection is disturbed in heavy rain. It is then advantageous, for example when fusing the existence probabilities of the sensor objects, to merge the information provided by the sensor device designed as a stereo camera into the fusion objects with a lower weight than under dry conditions. Likewise, lighting conditions, for example a low sun shining against the direction of travel, play a crucial role for the quality of the data received by means of the stereo camera. Other sensor devices that use other measurement methods and principles can also deliver measurement results of varying reliability depending on environmental influences. Also advantageous is an expert system which compares information from the sensor devices, for example with regard to their classification, and incorporates the data of individual sensor objects into the fusion only if they are plausible and/or confirmed by a sensor object of another sensor device. For example, if a radar detects a stationary manhole cover on a road, in one embodiment this information is only passed on by the expert system if it is confirmed by a camera.
  • In particular, in order to consistently combine the object-based environment representation provided via the fusion objects with the environment representation provided via the occupancy map, in one embodiment it is provided that tracking information is associated with the fusion objects which indicates a measure of the quality of the tracking of the fusion object over a plurality of measurement times; for fusion objects representing moving objects for which the tracking information indicates a sufficient tracking quality, the occupancy map cells in the extension area of the fusion object are marked as occupied and these cells are assigned the occupancy probability that corresponds to the existence probability of the fusion object. This creates an occupancy map representation in which both the static objects and the dynamically moving objects are represented with a high existence probability at the positions that correspond to the best available knowledge of the environment detection. It is understood that the cells whose occupancy state and existence or occupancy probability are assigned via the fusion objects are identified separately, so that the movement of the fusion object is taken into account when the occupancy map is updated.
  • An embodiment proves particularly advantageous in which, for cells whose occupancy probability is changed owing to a fusion object, the occupancy probabilities determined when creating the occupancy map without taking the fusion objects into account are stored. If the fusion object has moved away from the region associated with the cell, the previously stored occupancy state and/or occupancy probability of the cell can then be reassigned to the cell or included in the fusion with the newly acquired sensor device information.
  • The at least one first sensor device preferably comprises a stereo camera and / or the at least one second sensor device comprises a radar or lidar (laser imaging detection and ranging system).
  • The advantages described in connection with the method apply correspondingly to the corresponding device features of the device according to the invention.
  • The invention is explained in more detail below on the basis of embodiments with reference to the drawing, in which:
    • FIG. 1 shows a schematic block diagram of a device for providing an environment representation of a vehicle;
    • FIG. 2 shows a schematic representation for explaining a map-based environment representation;
    • FIG. 3 shows a schematic representation for describing the information acquisition with a radar system;
    • FIG. 4 shows a schematic representation for explaining the information acquisition by means of a stereo camera; and
    • FIG. 5 shows a schematic representation of a method for environment representation.
  • FIG. 1 schematically shows an embodiment of a device 1 for providing an environment representation of a vehicle. The device 1 comprises a first sensor device 2, which is designed, for example, as a radar or lidar. Such a device emits electromagnetic radiation in different directions, which is reflected back from objects in the environment of the vehicle. Based on the transit time and a possible frequency shift, a distance and a relative speed of the object with respect to the vehicle on which the first sensor device 2 is arranged can be determined. The device 1 further comprises a second sensor device 3, which is designed, for example, as a stereo camera. A stereo camera comprises two image acquisition units arranged at a distance from one another, which capture, for example, a left and a right camera image of the same spatial region of the environment. The stereo camera uses the disparity between the left and right camera images to calculate depth information for objects detected in the images. As a result, a 3D reconstruction of the objects detected in the images in the environment of the vehicle on which the second sensor device is arranged is possible. It is understood by those skilled in the art that the device for providing an environment representation may comprise further sensor devices which detect the surroundings of the vehicle according to the same or different measurement principles.
  • The first sensor device 2 and the second sensor device 3 are designed so that they pre-evaluate the acquired information and recognize objects in the environment of the motor vehicle. The information about each detected object is combined into a so-called sensor object, which comprises the attributes that characterize the corresponding object. These attributes may include, for example, the position, a velocity, an orientation of the motion, an extent in the different spatial directions, and so on. Likewise, the individual recognized objects can be classified according to their object type, for example by comparing the acquired data with reference data. The sensor devices 2, 3 are further designed so that all sensor objects, i.e. all detected objects, are provided with an existence probability as an attribute. This existence probability is a measure of whether an object recognized in the environment detection of one of the sensor devices 2, 3 actually exists in the environment of the motor vehicle.
  • The first sensor device 2 and the second sensor device 3 are coupled to an evaluation device 4, to which the information about the detected sensor objects, preferably in the form of an object list, is transmitted. In the evaluation device, the detected sensor objects of the various sensor devices 2, 3 are merged into fusion objects. Different fusion methods can be used here and can be adapted to the individual attributes. The evaluation device 4 comprises, for example, an expert system 5 in which expert knowledge is stored, which makes it possible to classify the fusion objects generated in the fusion more reliably than is possible, for example, for the individual sensor devices. For example, while the spatial dimensions of a detected object can often be determined well using a stereo camera, dynamic quantities, for example a relative speed with respect to the vehicle, cannot. This information can, however, be determined by a radar system on the basis of the Doppler effect. If the information about a spatial extent and a detected relative speed are related to the known intrinsic speed of the vehicle, individual detected objects can be classified more reliably. A parked refuse container and a small car, for example, have approximately identical dimensions. On the basis of the determined speed of the detected object in a stationary coordinate system, however, such a small car participating in traffic can be distinguished from a parked container.
  • In particular, the evaluation device is designed to identify fusion objects from the sensor objects detected by the first sensor device 2 and the second sensor device 3 and, in particular, to fuse an existence probability for each fusion object from the existence probabilities of the sensor objects. The fusion is carried out depending on the sensor device. For example, the weight of the existence probability of a sensor object in the fusion is reduced when a third sensor device 17, which is designed for example as a rain sensor, detects rain, which adversely affects the data acquisition of a camera (but not a radar measurement), and the sensor object is provided by the second sensor device 3 designed as a stereo camera. In some embodiments, the sensor devices 2, 3 are designed so that they recognize as objects only those that move within a fixed coordinate system, i.e. only dynamic objects are detected as sensor objects or transmitted to the evaluation device 4.
  • In order to obtain an improved environment representation, the first sensor device 2 and the second sensor device 3 additionally provide the evaluation device 4 with information that enables it to create an occupancy map of the environment of the vehicle. For this purpose, the environment of the vehicle is conceptually divided into areas, and each area is assigned a cell. The individual cells then form a so-called occupancy map. Each cell is assigned an occupancy state, which indicates whether an object is located in the area of the environment assigned to the cell (state "occupied") or not (state "free"). A cell may also assume the state "unknown" if the acquired information does not allow a statement about the occupancy state of the corresponding area of the surroundings of the vehicle.
  • The person skilled in the art knows methods for creating such an occupancy map. It is common to assign to each cell at least one occupancy probability, which is a statistical measure of whether the cell is occupied or free. If the occupancy probability for a cell is limited to a value between zero and one, it is possible, for example, to assign the value one to the state "occupied" and the value zero to the state "free". Since individual measurements indicate only a finite probability for the existence of an object in a corresponding spatial area, it is advantageous to confirm the occupancy probability of a cell in the occupancy map by fusing the occupancy information of different measurement cycles. Frequently used fusion methods are Bayes' theorem, the Dempster-Shafer theory or fuzzy logic. For example, the occupancy probability P_occ^i(k+1) of a cell i at time k+1 can be calculated using the following fusion formula:
    P_occ^i(k+1) = [P_occ^i(k) · P_occ^{i,Sensor}(k+1)] / [P_occ^i(k) · P_occ^{i,Sensor}(k+1) + (1 − P_occ^i(k)) · (1 − P_occ^{i,Sensor}(k+1))],
    where P_occ^{i,Sensor}(k+1) is the occupancy probability for cell i determined by the sensor device at time k+1. Cells are initialized with an occupancy probability P_occ = 0.5, which means that the occupancy state is unknown.
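  • A minimal Python sketch of this cell update, using the binary Bayes formula above (the measurement values are invented for the example):

    def update_cell(p_cell, p_sensor):
        # Binary Bayes fusion of the stored and the newly measured occupancy probability
        num = p_cell * p_sensor
        return num / (num + (1.0 - p_cell) * (1.0 - p_sensor))

    p = 0.5                              # initial state: occupancy unknown
    for p_meas in (0.7, 0.8, 0.6):       # three consecutive measuring cycles
        p = update_cell(p, p_meas)
    print(round(p, 3))                   # occupancy probability grows with consistent evidence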
  • FIG. 2 shows schematically, in the upper part, a plan view of a street scene in which a motor vehicle 6 moves on a street 7 in a direction indicated by an arrow 8. At the roadside 9 on the left in the direction of travel there are objects 10 in the form of trees and shrubs. At the roadside 11 on the right in the direction of travel there is a guardrail 12. Furthermore, another object 14 is located on a side strip 13. A coordinate system 15 is drawn in to facilitate orientation. The lower part of FIG. 2 shows an occupancy map 16 as it has been detected from the motor vehicle 6 while driving. An occupancy probability is indicated by hatching: the denser the hatching, the higher the occupancy probability. The motor vehicle itself is entered with an occupancy probability of 1, although it is of course not detected by the sensor devices. In the occupancy map, the recognized objects are marked with the same reference numerals as in the upper part, but provided with a prime.
  • With reference to FIGS. 3 and 4, it is explained how the various sensor devices acquire and evaluate information for creating an occupancy map and for detecting sensor objects.
  • FIG. 3 shows on the right a photographic image of an area in front of a motor vehicle, as seen from the motor vehicle. A roadway 22 can be recognized, on which a preceding vehicle 23 is located. Furthermore, a street tree 24 can be recognized to the left of the roadway 22. At a greater distance, laterally to the left of the road 22, parked vehicles 25 can also be faintly recognized. On the left side of FIG. 3, the raw data acquired by a radar system are plotted graphically. The ordinate 27 indicates a distance in front of the motor vehicle, while the abscissa 28 is oriented transversely to the direction of travel. In a detection area 32, grey levels encode the signal strength of the signals that have been reflected back to the vehicle from the corresponding area: the darker an area, the higher the measured signal strength. Clearly visible are the strong reflection signals 29 corresponding to the street tree 24 and the reflection signals 30 corresponding to the preceding vehicle 23. Via a weaker hatching, reflection signals 31 that can be assigned to the parked vehicles 25 are also recognizable. This acquired information can be used directly by the evaluation device (cf. FIG. 1) to create an occupancy map; an occupancy probability is given here by the signal strength of the reflection.
  • The sensor device is further configured so that individual objects, here the street tree 24 and the preceding vehicle 23, are identified as objects in the environment of the motor vehicle and, where appropriate, classified, for example via an intrinsic speed determined by evaluating a Doppler shift. A classification can also be made, for example, via a height-to-extent ratio.
  • FIG. 4 shows, analogously, the data acquisition for a stereo camera. On the left side there is again a photographic image of the area in front of the motor vehicle. A roadway 42, several preceding vehicles 43-45, a street lamp 46, some street trees 47 and a hedge or wall 48 can be recognized. On the basis of the disparity between two images taken at a distance from one another, a 3D reconstruction of the recorded elements and objects is carried out. As a result, both a position of the objects in the plane in which the vehicle moves and a height of the objects above this plane can be determined. On the right side of FIG. 4, a plan view of the space in front of the vehicle is plotted. The ordinate 51 again indicates the distance from the vehicle in the direction of travel and the abscissa 52 the orientation transverse to the direction of travel of the vehicle. A grey level indicates in each case the height of the reconstructed pixels in the corresponding area; the heights correspond to the heights of the detected objects relative to this plane. Both the wall/hedge 48', the preceding vehicles 43'-45' and the street lamp 46' can be seen in the illustration. A measure of the existence probability is given by how many pixels have been detected within the surface area associated with the detected object. The displayed information can thus also be used to improve an occupancy map. Furthermore, so-called object tracking can be performed on the basis of several consecutively recorded stereo images. In one embodiment, those detected objects which are dynamic objects are selected and provided to the evaluation device 4 (see FIG. 1) in the form of an object list.
  • In order to determine the existence probability of the detected objects more reliably, they are tracked by the individual sensor devices over several measuring cycles. In this case, an existence probability is determined on the basis of the frequency with which an object has previously been detected, i.e. on the basis of its "lifetime" (detection lifetime): the more often an object is detected, the higher its existence probability and the less likely it is that the detection is due to measurement noise or the like. In addition, for a camera, the number of 3D points assigned to the object is evaluated in order to obtain an existence probability: the more 3D pixels are assigned to an object, the greater its existence probability. However, it should be noted that, because of the imaging perspective, objects at a great distance inevitably cover only a smaller image area, so that the number of pixels that can be assigned to them is necessarily lower than for an object located in the immediate vicinity of the vehicle at the time of detection. Therefore, in one embodiment, the existence probability is determined from the number of detected measurement points assigned to the object as a function of the distance from the vehicle. For other sensor devices, other criteria can be used to determine an existence probability of the detected objects; for radar or lidar sensors, as already mentioned above, a reflection signal strength is used, for example. The individual existence probabilities determined in this way for the individual objects are preferably fused in the individual sensor devices via an approach originating from fuzzy logic, which is specified by the following fusion formulas for the existence probability of individual objects as determined in the sensor devices. The γ factors are used to select a combination between an OR combination (γ = 1) and an AND combination (γ = 0) adapted to the corresponding sensor type; the γ factors are thus individually adapted for each individual sensor or sensor device. For a stereo camera, the existence probability of a detected sensor object is thus preferably calculated according to the following formula:
    P_CameraObj(k) = [P_Lifetime(k) · P_3DPoints(k)]^{1−γ} · [1 − (1 − P_Lifetime(k)) · (1 − P_3DPoints(k))]^{γ},
    where P_Lifetime(k) indicates the existence probability derived from the number of previous detections and P_3DPoints(k) the existence probability determined from the number of 3D points assigned to the object. Accordingly, the fusion formula for a sensor object detected by a radar system is:
    P_RadarObj(k) = [P_Lifetime(k) · P_SignalStrength(k)]^{1−γ} · [1 − (1 − P_Lifetime(k)) · (1 − P_SignalStrength(k))]^{γ},
    where P_SignalStrength(k) indicates the existence probability of an object determined on the basis of the reflection signal strength.
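  • The following sketch applies the gamma operator used in both formulas above to invented example values; the probability inputs are placeholders, not measured data.

    def gamma_fusion(p1, p2, gamma):
        # Blend between an AND combination (gamma = 0) and an OR combination (gamma = 1)
        return ((p1 * p2) ** (1.0 - gamma)) * ((1.0 - (1.0 - p1) * (1.0 - p2)) ** gamma)

    # Stereo camera: lifetime-based and 3D-point-based existence probabilities
    p_camera_obj = gamma_fusion(0.9, 0.7, gamma=0.6)
    # Radar: lifetime-based and signal-strength-based existence probabilities
    p_radar_obj = gamma_fusion(0.8, 0.65, gamma=0.4)
    print(round(p_camera_obj, 3), round(p_radar_obj, 3))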
  • Even if the existence probabilities determined in the individual sensor devices already have a high quality, they can be significantly improved by fusing the sensor objects into fusion objects in the evaluation device. This is shown by way of example in FIG. 5. A first sensor device 61 designed as a radar and a second sensor device 62 designed as a stereo camera detect the environment. These sensor devices 61, 62 provide both information 63 that can be used for a map-based fusion 64 and information 65 about detected sensor objects for an object-based sensor fusion 66. In some embodiments, the information 63 for the map-based fusion comprises only information that cannot be assigned to the sensor objects supplied to the object-based fusion 66. Accordingly, the object-based fusion uses only those recognized objects that are recognized as dynamic, i.e. self-moving, objects.
  • In the map-based fusion 64, the individual pieces of information are first associated with occupancy map cells 67, and the individual cells of the occupancy map are then updated 68. In particular, the previously determined occupancy probabilities of the cells are fused with the newly determined occupancy probabilities. The result is an occupancy map 69 with cells to which occupancy probabilities are assigned.
  • In the object-based fusion 66, the sensor objects provided by the individual sensor devices 61, 62 in the form of object lists are associated with one another 71 and tracked, for example via Kalman filter tracking 72. Depending on whether a sensor object can be fused with a previously detected object, either a new fusion object is initialized 73 or the fusion object with which the sensor object can be associated is updated 74. Fusion objects that fall below a given existence probability and cannot be associated with any sensor object are rejected 75.
  • If an object is detected for the first time, the associated sensor object cannot be associated with any of the previously determined fusion objects. A new fusion object is then initialized, whose existence probability P_FusObj(k) is assigned the existence probability P_SenObj(k) of the corresponding sensor object:
    P_FusObj(k) = P_SenObj(k).
  • If a sensor object can be associated with a fusion object, the existence probability of the fusion object at time k+1 is calculated according to the following formula:
    P_FusObj(k+1) = [P_FusObj(k)^{δ_A} · P_SenObj(k+1)^{δ_B}]^{1−γ} · [1 − (1 − P_FusObj(k)^{δ_A}) · (1 − P_SenObj(k+1)^{δ_B})]^{γ}.
    Here, δ_A and δ_B indicate the weights for the fusion object and the sensor object, with δ_A + δ_B = 1. The weights are determined on the basis of a current sensor confidence level and the deviation between the predicted fusion object and the measured sensor object. If a large deviation is detected in the tracking of the fusion object, the existence probability of the sensor object is weighted less. The weight of the sensor object is also reduced when external influences impair the measurement reliability of a sensor device; rain, for example, adversely affects the detection reliability of a stereo camera. Such information can be obtained, for example, via the third sensor device 17 of the device 1 according to FIG. 1. The weights are thus adjusted per cycle and per sensor device. As with the fusion formulas given above, the γ factor again indicates whether the two existence probabilities are to be fused more in the manner of an OR combination or an AND combination.
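  • Purely by way of illustration, the following heuristic shows one conceivable way of deriving the weights δ_A and δ_B from a sensor confidence value and the association deviation; the patent does not specify the weighting rule in this form.

    def sensor_weights(sensor_confidence, association_residual, scale=1.0):
        # Low confidence (e.g. a camera in rain) or a large deviation between prediction
        # and measurement shifts weight from the sensor object towards the fusion object.
        delta_b = sensor_confidence / (1.0 + association_residual / scale)
        delta_b = max(0.0, min(1.0, delta_b))
        return 1.0 - delta_b, delta_b      # delta_a + delta_b = 1

    print(sensor_weights(sensor_confidence=0.9, association_residual=0.2))
    print(sensor_weights(sensor_confidence=0.4, association_residual=1.5))  # rainy camera, poor match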
  • If a fusion object cannot be associated with any of the detected sensor objects, in a preferred embodiment its existence probability is nevertheless updated: instead of an existence probability provided by a sensor object, the occupancy probability taken from the occupancy map at the location where the fusion object should be located according to the prediction is used to derive an existence probability. The existence probability of the fusion object is fused in such a case according to the following formula:
    P_FusObj(k+1) = [P_FusObj(k) · P_Grid(k+1)]^{1−γ} · [1 − (1 − P_FusObj(k)) · (1 − P_Grid(k+1))]^{γ},
    where γ is again a suitable fusion constant and P_Grid(k+1) indicates the occupancy probability derived from the occupancy map at time k+1, preferably after the occupancy map has been updated.
  • If a fusion object cannot be associated with any of the sensor objects and its existence probability is less than a default value, for example less than 0.2, the fusion object is discarded, i.e. no longer tracked.
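  • Taken together, the initialization, update, fallback and rejection steps can be sketched as one fusion cycle over a list of tracks. The sketch below reuses the fuse_existence and fuse_with_grid helpers from the earlier examples; the track structure and the identity-based matching are simplified stand-ins for the association actually described above.

    def fusion_cycle(fusion_tracks, sensor_objects, grid_lookup, gamma=0.5, p_min=0.2):
        updated, unmatched = [], list(sensor_objects)
        for track in fusion_tracks:
            match = next((s for s in unmatched if s["id"] == track["id"]), None)
            if match is not None:
                unmatched.remove(match)
                # update: fuse track and sensor existence probabilities (equal weights here)
                track["p"] = fuse_existence(track["p"], match["p"], 0.5, 0.5, gamma)
            else:
                # fallback: fuse with the occupancy probability at the predicted location
                track["p"] = fuse_with_grid(track["p"], grid_lookup(track), gamma)
            if track["p"] >= p_min:        # rejection of weak, unconfirmed tracks
                updated.append(track)
        for s in unmatched:                # initialization of new fusion objects
            updated.append({"id": s["id"], "p": s["p"]})
        return updated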
  • The object-based fusion 66 thus delivers fusion objects with a very reliable existence probability 76.
  • In one embodiment, it is provided that the dynamic objects are entered into the occupancy map either according to their classification, which indicates an extent, or only selectively, i.e. in a single cell, and that the determined existence probability of the corresponding fusion object is assigned to them there as occupancy probability. This yields a much improved occupancy map in which both the static areas and the dynamic objects are each recorded with high reliability in terms of occupancy probability or existence probability.
  • Overall, the other vehicle systems are thus provided with an environment representation that optimally characterizes the dynamic objects and the static areas. Both the fusion objects with their existence probabilities and an occupancy map with and/or without the fusion objects entered into it can be provided.
  • It is understood that the illustrated embodiments are merely exemplary. In particular, more sensor devices can be included in the sensor data fusion. It should also be noted that the fusion of the sensor objects is in each case carried out against fusion objects; this offers the advantage that the individual sensor devices do not have to provide their sensor data synchronized in time. Other embodiments may, however, provide that sensor data of different individual sensor devices are first associated with one another and fused, and only then optionally additionally fused with a previously determined fusion object to form an updated fusion object.
  • LIST OF REFERENCE NUMBERS
    1: Device for providing an environment representation
    2: first sensor device
    3: second sensor device
    4: evaluation device
    5: expert system
    6: motor vehicle
    7: street
    8: arrow
    9: left roadside (in the direction of travel)
    10: objects
    11: right roadside (in the direction of travel)
    12: guardrail
    13: side strip (shoulder)
    14: another object
    15: coordinate system
    16: occupancy map
    17: third sensor device
    21: image
    22: roadway
    23: preceding vehicle
    24: street tree
    25: parked vehicles
    27: ordinate
    28: abscissa
    29-31: reflection signals
    32: detection range
    41: image
    42: roadway
    43-45: preceding vehicles
    46: street lamp
    47: street trees
    48: hedge/wall
    51: ordinate
    52: abscissa
    53: edge strip
    61: first sensor device
    62: second sensor device
    63: information for the map-based fusion
    64: map-based fusion
    65: information for the sensor object fusion
    66: object-based fusion
    67: association with cells
    68: update of the cells
    69: occupancy map
    71: association
    72: Kalman filter tracking
    73: initialization
    74: update
    75: rejection
    76: provision of fusion objects with existence probabilities
    77: provision of an environment representation

Claims (19)

  1. A method for providing an environment representation of a vehicle having at least one first sensor device (2), at least one second sensor device (3) and an evaluation device (4), wherein the sensor devices (2, 3) provide information about objects (10, 14) detected in an environment of the vehicle (6) in the form of sensor objects, a sensor object representing an object (10, 14) recognized by the respective sensor device (2, 3) and the sensor objects comprising as an attribute at least one existence probability of the represented object (10, 14), and the sensor objects detected by the at least one first sensor device (2) and by the at least one second sensor device (3) are subjected to an object fusion (66) in which fusion objects are generated, to which at least one existence probability is assigned as an attribute, characterized in that the existence probabilities of the fusion objects are fused on the basis of the existence probabilities of the sensor objects, wherein the fusing of the existence probabilities of the sensor objects takes place in dependence on the respective sensor device (2, 3) from which the corresponding sensor object is provided, and wherein for the fusion objects determined at time k a prediction for a time k+1 is made and the sensor objects determined at time k+1 are associated with those fusion objects whose prediction they match according to association specifications, the existence probability of the fusion object at time k+1 being fused on the basis of the existence probability of the fusion object at time k and the existence probability of the sensor object at time k+1, and wherein the existence probability of a fusion object that can be associated with a sensor object is calculated according to the following formula:
    P_FusObj(k+1) = [P_FusObj(k)^δA · P_SenObj(k+1)^δB]^(1−γ) · [1 − (1 − P_FusObj(k)^δA) · (1 − P_SenObj(k+1)^δB)]^γ,
    where P_FusObj(k) is the existence probability of the fusion object at time k, P_SenObj(k) is the existence probability of the sensor object at time k, γ is a sensor-independent fusing constant and δA and δB are sensor-device-dependent fusing constants, with δA + δB = 1.
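The fusion rule in claim 1 combines a conjunctive term (both probabilities support existence) with a disjunctive term (at least one probability supports existence), weighted by γ and the sensor-device-dependent exponents δA and δB. The following minimal Python sketch merely evaluates that formula; the function name, parameter names and example constants are assumptions for illustration and not part of the claimed method.

```python
def fuse_existence(p_fus_k, p_sen_k1, delta_a, delta_b, gamma):
    """Fuse P_FusObj(k) with the associated sensor object's P_SenObj(k+1)."""
    assert abs(delta_a + delta_b - 1.0) < 1e-9, "delta_a + delta_b must equal 1"
    a = p_fus_k ** delta_a             # sensor-device-weighted fusion-object term
    b = p_sen_k1 ** delta_b            # sensor-device-weighted sensor-object term
    conjunctive = (a * b) ** (1.0 - gamma)                 # both sources support existence
    disjunctive = (1.0 - (1.0 - a) * (1.0 - b)) ** gamma   # at least one source supports it
    return conjunctive * disjunctive

# Example: a sensor object with high existence probability confirms a fusion object.
p_next = fuse_existence(p_fus_k=0.7, p_sen_k1=0.9, delta_a=0.5, delta_b=0.5, gamma=0.4)
```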
  2. Method according to Claim 1, characterized in that, for sensor objects that cannot be associated with any of the known fusion objects, the fusion comprises generating a new fusion object to which the existence probability of the sensor object is assigned as its existence probability.
  3. Method according to one of the preceding claims, characterized in that the sensor devices provide information on the basis of which an occupancy map (16, 69) is created, wherein the environment is divided into areas and each area is assigned a cell, the cells representing the occupancy map, and occupancy probabilities are determined for the cells which indicate a measure of the existence of an object in the region of the environment associated with the respective cell.
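Claim 3 divides the environment into areas, each represented by a cell of an occupancy map that carries an occupancy probability. The sketch below shows one plausible data structure for such a map; the grid dimensions, cell size and the blending update rule are assumptions, not specified by the claim.

```python
import numpy as np

class OccupancyMap:
    """Illustrative occupancy map: each cell holds the occupancy probability of
    the environment area it represents (sizes and update rule are assumed)."""

    def __init__(self, width_m=60.0, depth_m=100.0, cell_size_m=0.5):
        self.cell_size = cell_size_m
        rows, cols = int(depth_m / cell_size_m), int(width_m / cell_size_m)
        self.grid = np.full((rows, cols), 0.5)   # 0.5 = unknown occupancy

    def cell_index(self, x_m, y_m):
        # Map a point (vehicle coordinates, origin at the grid corner) to its cell.
        return int(x_m / self.cell_size), int(y_m / self.cell_size)

    def update(self, x_m, y_m, p_measured, alpha=0.3):
        # Blend new occupancy evidence into the cell (simple assumed rule).
        i, j = self.cell_index(x_m, y_m)
        self.grid[i, j] = (1.0 - alpha) * self.grid[i, j] + alpha * p_measured

    def occupancy(self, x_m, y_m):
        i, j = self.cell_index(x_m, y_m)
        return self.grid[i, j]
```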
  4. Method according to Claim 3, characterized in that, for fusion objects that can be assigned to none of the sensor objects at a measurement time k+1, the existence probability is fused by using, instead of the existence probability of a corresponding sensor object, the occupancy probability determined on the basis of the occupancy map (16, 69) for the area or areas at which the fusion object is located according to the prediction.
  5. Method according to Claim 4, characterized in that the fusion takes place according to the following formula:
    P_FusObj(k+1) = [P_FusObj(k) · P_Grid(k+1)]^(1−γ) · [1 − (1 − P_FusObj(k)) · (1 − P_Grid(k+1))]^γ,
    where P_Grid(k+1) denotes the occupancy probability derived from the occupancy map at the predicted location of the fusion object.
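Claims 4 and 5 reuse the fusion rule of claim 1 for the case in which no sensor object can be associated at time k+1, substituting the occupancy probability P_Grid at the predicted object location. A minimal sketch of that fallback, with assumed names and example values:

```python
def fuse_with_grid(p_fus_k, p_grid_k1, gamma):
    """Fuse P_FusObj(k) with the occupancy probability P_Grid(k+1) at the
    predicted location when no sensor object could be associated."""
    conjunctive = (p_fus_k * p_grid_k1) ** (1.0 - gamma)
    disjunctive = (1.0 - (1.0 - p_fus_k) * (1.0 - p_grid_k1)) ** gamma
    return conjunctive * disjunctive

# Example: an unconfirmed fusion object over a weakly occupied region loses support.
p_next = fuse_with_grid(p_fus_k=0.8, p_grid_k1=0.3, gamma=0.4)
```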
  6. Method according to one of the preceding claims, characterized in that the sensor objects are classified according to their type.
  7. Method according to one of the preceding claims, characterized in that the sensor objects and fusion objects are subjected to a fusion process with regard to further attributes, for example using a Kalman filter.
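Claim 7 mentions fusing further attributes, for example with a Kalman filter. The sketch below shows a generic linear Kalman predict/update cycle for a constant-velocity state (x, y, vx, vy) whose position is measured by an associated sensor object; the state layout, time step and noise values are assumptions, not values from the patent.

```python
import numpy as np

def kalman_predict(x, P, dt=0.1, q=0.5):
    """Predict a constant-velocity state [x, y, vx, vy] to the next time step."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    Q = q * np.eye(4)                     # process noise (assumed)
    return F @ x, F @ P @ F.T + Q

def kalman_update(x, P, z, r=1.0):
    """Update with a sensor-object position measurement z = [x, y]."""
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    R = r * np.eye(2)                     # measurement noise (assumed)
    y = z - H @ x                         # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    return x + K @ y, (np.eye(4) - K @ H) @ P
```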
  8. Method according to one of the preceding claims, characterized in that by means of at least one third sensor device (17) information about the environment and/or about the at least one first sensor device (2) and/or the at least one second sensor device (3) is detected, and the sensor-device dependence of the fusion is varied depending on the information of the at least one third sensor device (17).
  9. Method according to one of Claims 3 to 8, characterized in that a tracking information is assigned to the fusion objects which indicates a measure of a quality of a tracking of the fusion object over a plurality of measurement times, and, for fusion objects that represent moving objects and for which the tracking information indicates a sufficient tracking quality, the cells of the occupancy map lying in the extension area of the fusion object are marked as occupied and these cells are assigned the occupancy probability that corresponds to the existence probability of the fusion object.
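Claim 9 marks the occupancy-map cells covered by a well-tracked, moving fusion object as occupied with the object's existence probability. The sketch below assumes the OccupancyMap sketch shown after claim 3, a dictionary-based fusion object and a threshold for "sufficient tracking quality"; all of these are illustrative assumptions.

```python
MIN_TRACK_QUALITY = 0.6   # assumed threshold for "sufficient tracking quality"

def stamp_fusion_object(occupancy_map, fusion_obj):
    """Mark the cells covered by a well-tracked moving fusion object as occupied,
    assigning them the object's existence probability."""
    if not fusion_obj["is_moving"] or fusion_obj["track_quality"] < MIN_TRACK_QUALITY:
        return
    x0, y0, length, width = fusion_obj["extent"]   # rectangle covered by the object
    step = occupancy_map.cell_size
    x = x0
    while x < x0 + length:
        y = y0
        while y < y0 + width:
            # alpha=1.0 overwrites the cell with the object's existence probability
            occupancy_map.update(x, y, fusion_obj["p_existence"], alpha=1.0)
            y += step
        x += step
```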
  10. Device (1) for providing an environment representation of a vehicle (6), comprising at least one first sensor device (2) and at least one second sensor device (3), which detect the environment, and an evaluation device (4) which is coupled to the at least one first sensor device (2) and the at least one second sensor device (3), wherein the sensor devices (2, 3) are designed to provide information about respectively detected objects (10, 14) in the form of sensor objects, which comprise as an attribute at least one existence probability for the detected object (10, 14), and the evaluation device (4) is adapted to subject the sensor objects detected by the at least one first sensor device (2) and by the at least one second sensor device (3) to an object fusion in which fusion objects are generated, to which at least one existence probability is assigned as an attribute, characterized in that the fusion of an existence probability of the fusion object is executed on the basis of the existence probabilities of the sensor objects and takes place in dependence on the respective sensor device (2, 3) from which the sensor object is provided, wherein the evaluation device (4) comprises a prediction unit which makes a prediction for a time k+1 for the fusion objects determined at time k, the evaluation device (4) associates the sensor objects detected at time k+1 with those fusion objects whose prediction the sensor objects match according to association specifications, and the existence probability of the fusion object at time k+1 is fused on the basis of the existence probability of the fusion object at time k and the existence probability of the sensor object at time k+1, and wherein the evaluation device (4) is designed to calculate the existence probability of a fusion object that can be associated with a sensor object according to the following formula:
    P_FusObj(k+1) = [P_FusObj(k)^δA · P_SenObj(k+1)^δB]^(1−γ) · [1 − (1 − P_FusObj(k)^δA) · (1 − P_SenObj(k+1)^δB)]^γ,
    where P_FusObj(k) is the existence probability of the fusion object at time k, P_SenObj(k) is the existence probability of the sensor object at time k, γ is a sensor-independent fusing constant and δA and δB are sensor-device-dependent fusing constants, with δA + δB = 1.
  11. Device (1) according to Claim 10, characterized in that the evaluation device (4) is designed, for sensor objects which cannot be associated with any of the known fusion objects, to generate a new fusion object in the fusion, to which the existence probability of the sensor object is assigned as its existence probability.
  12. Device (1) according to Claim 10 or 11, characterized in that the evaluation device (4) comprises a map generating device and the sensor devices (2, 3) are designed to provide information by means of which the map generating device can create an occupancy map (16, 69), wherein the environment is divided into areas and each area is assigned a cell of the occupancy map (16, 69), and the map generating device is adapted to determine occupancy probabilities for the cells which indicate a measure of the existence of an object (10, 14) in the area of the environment associated with the respective cell.
  13. Device (1) according to Claim 12, characterized in that the evaluation device is designed so that, for fusion objects that can be assigned to none of the sensor objects at a measurement time k+1, the existence probability is fused by using, instead of the existence probability of a corresponding sensor object, the occupancy probability determined on the basis of the occupancy map (16, 69) for the area or areas where the fusion object is located according to the prediction.
  14. Device (1) according to Claim 13, characterized in that the fusion takes place according to the following formula:
    P_FusObj(k+1) = [P_FusObj(k) · P_Grid(k+1)]^(1−γ) · [1 − (1 − P_FusObj(k)) · (1 − P_Grid(k+1))]^γ,
    where P_Grid(k+1) denotes the occupancy probability derived from the occupancy map at the predicted location of the fusion object.
  15. Device (1) according to one of Claims 10 to 14, characterized in that comparative information is available in at least one of the sensor devices (2, 3) and/or in the evaluation device (4), on the basis of which the sensor objects can be classified with respect to their type.
  16. Device (1) according to one of Claims 10 to 15, characterized in that the evaluation device (4) is designed to subject the sensor objects and fusion objects to a fusion process with respect to further attributes, for example using a Kalman filter.
  17. Device (1) according to one of Claims 10 to 16, characterized in that at least one third sensor device (17) is coupled to the evaluation device, which detects information about the environment and/or about the at least one first sensor device (2) and/or the at least one second sensor device (3), and the evaluation device (4) is adapted to vary the sensor-device dependency of the fusion depending on the information of the at least one third sensor device (17).
  18. Device (1) according to one of Claims 12 to 17, characterized in that the evaluation device (4) is adapted to assign to the fusion objects a tracking information indicating a measure of a quality of a tracking of the fusion object over a plurality of measurement times, and, for fusion objects that represent moving objects and for which the tracking information indicates a sufficient tracking quality, to mark the cells of the occupancy map (16, 69) lying in the extension area of the fusion object as occupied and to assign to these cells the occupancy probability corresponding to the existence probability of the fusion object.
  19. Device (1) according to one of Claims 10 to 18, characterized in that the at least one first sensor device (2) comprises a stereo camera and/or the at least one second sensor device (3) comprises a radar or lidar (Laser Imaging Detection and Ranging) system.
DE102009006113.4A 2008-03-03 2009-01-26 Device and method for sensor fusion with dynamic objects Active DE102009006113B4 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
DE102008012197 2008-03-03
DE102008012197.5 2008-03-03
DE102009006113.4A DE102009006113B4 (en) 2008-03-03 2009-01-26 Device and method for sensor fusion with dynamic objects

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE102009006113.4A DE102009006113B4 (en) 2008-03-03 2009-01-26 Device and method for sensor fusion with dynamic objects

Publications (2)

Publication Number Publication Date
DE102009006113A1 DE102009006113A1 (en) 2009-09-10
DE102009006113B4 true DE102009006113B4 (en) 2019-03-28

Family

ID=40936482

Family Applications (1)

Application Number Title Priority Date Filing Date
DE102009006113.4A Active DE102009006113B4 (en) 2008-03-03 2009-01-26 Device and method for sensor fusion with dynamic objects

Country Status (1)

Country Link
DE (1) DE102009006113B4 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102614003B (en) * 2011-01-31 2015-10-28 柯惠Lp公司 For lock cam driver and the clamp assemblies of applicator

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102010006828A1 (en) * 2010-02-03 2011-08-04 Volkswagen AG, 38440 Method for creating model of surrounding of vehicle, involves detecting surrounding of vehicle with help of sensor, where two surrounding models are produced automatically on basis of sensor data
DE102010003375A1 (en) * 2010-03-29 2011-09-29 Bayerische Motoren Werke Aktiengesellschaft Surrounding evaluation system, particularly for use in speed monitoring systems in vehicle, has electronic controller and sensor mediums for detecting objects in surrounding of vehicle
DE102010023199A1 (en) 2010-06-09 2011-02-10 Daimler Ag Method for operating image capturing device for realizing outer mirror functions of motor vehicle, involves monitoring momentary state of cameras in dependence of result of comparison and/or merging of environmental characteristics
DE102012220191A1 (en) 2012-11-06 2014-05-08 Robert Bosch Gmbh Method for supporting driver during transverse guide of vehicle, involves carrying out steering intervention during collision of vehicle with recognized objects, and determining vehicle data through forward facing camera
DE102013203239A1 (en) * 2013-02-27 2014-08-28 Bayerische Motoren Werke Aktiengesellschaft grid-based prediction of the position of an object
DE102013008828A1 (en) 2013-05-24 2014-11-27 Volkswagen Aktiengesellschaft Method for creating a model of an environment of a vehicle and a correspondingly configured driver assistance system
DE102013209764A1 (en) * 2013-05-27 2014-11-27 Robert Bosch Gmbh Support of a driver of a motor vehicle
DE102013211109A1 (en) * 2013-06-14 2014-12-31 Robert Bosch Gmbh Assistance device and method for assisting a driver of the vehicle
JP5991332B2 (en) 2014-02-05 2016-09-14 トヨタ自動車株式会社 Collision avoidance control device
FR3020616B1 (en) * 2014-04-30 2017-10-27 Renault Sas Device for signaling objects to a navigation module of a vehicle equipped with said device
DE102014220687A1 (en) * 2014-10-13 2016-04-14 Continental Automotive Gmbh Communication device for a vehicle and method for communicating
DE102014223363A1 (en) 2014-11-17 2016-05-19 Volkswagen Aktiengesellschaft Method and device for localizing a motor vehicle in a fixed reference card
JP6365385B2 (en) * 2015-04-17 2018-08-01 トヨタ自動車株式会社 Three-dimensional object detection apparatus and three-dimensional object detection method
DE102015107392A1 (en) * 2015-05-12 2016-11-17 Valeo Schalter Und Sensoren Gmbh Method for detecting an object in an environment of a motor vehicle based on fused sensor data, control device, driver assistance system and motor vehicle
DE102015107391A1 (en) * 2015-05-12 2016-11-17 Valeo Schalter Und Sensoren Gmbh Method for controlling a functional device of a motor vehicle on the basis of fused sensor data, control device, driver assistance system and motor vehicle
CN105334514A (en) * 2015-10-19 2016-02-17 上海无线电设备研究所 Tramcar radar video compound early warning crashproof system and method
DE102016205661A1 (en) * 2016-04-06 2017-10-12 Continental Teves Ag & Co. Ohg Method for avoiding a collision and vehicle
DE102016212716A1 (en) * 2016-07-13 2018-01-18 Conti Temic Microelectronic Gmbh Control device and method
FR3056530B1 (en) * 2016-09-29 2019-07-12 Valeo Schalter Und Sensoren Gmbh Obstruction detection by fusion of objects for motor vehicle
FR3056531B1 (en) * 2016-09-29 2019-07-12 Valeo Schalter Und Sensoren Gmbh Obstacle detection for motor vehicle
DE102016220581A1 (en) * 2016-10-20 2018-04-26 Continental Teves Ag & Co. Ohg Method and device for determining a environmental model
DE102016122193A1 (en) * 2016-11-18 2018-05-24 Valeo Schalter Und Sensoren Gmbh Function monitoring of a sensor device of a motor vehicle
US20180203100A1 (en) * 2017-01-19 2018-07-19 Honeywell International Inc. Quality metric for ranging sensor in a degraded visual environment for a situational awareness system
FR3062836B1 (en) * 2017-02-10 2019-04-26 Valeo Schalter Und Sensoren Gmbh Method and system for determining a trust index associated with an object of an environment of a vehicle
DE102017210830A1 (en) * 2017-06-27 2018-12-27 Bayerische Motoren Werke Aktiengesellschaft Create and use an edge development map
DE102017116017A1 (en) 2017-07-17 2019-01-17 Valeo Schalter Und Sensoren Gmbh An automotive sensor device having a plurality of sensor units and a plurality of neural networks for generating a combined representation of an environment
DE102017116016A1 (en) 2017-07-17 2019-01-17 Valeo Schalter Und Sensoren Gmbh A motor vehicle sensor device having a plurality of sensor units and a neural network for generating an integrated representation of an environment
CN111032476A (en) 2017-08-10 2020-04-17 西门子交通有限公司 Regulation of mileage measurement parameters in a sensor-controlled manner as a function of weather conditions
DE102017220269A1 (en) * 2017-11-14 2019-05-16 Continental Automotive Gmbh Driver assistance system, vehicle and method for tracking objects
DE102017223325A1 (en) 2017-12-20 2019-06-27 Conti Temic Microelectronic Gmbh Method and device for merging measurements from different sources of information
DE102018205248A1 (en) * 2018-04-09 2019-10-10 Bayerische Motoren Werke Aktiengesellschaft Fusion system for the fusion of environment information for a motor vehicle
DE102018004125A1 (en) 2018-05-23 2018-11-22 Daimler Ag Method for analyzing an object in an environment of a motor vehicle
DE102018118666A1 (en) * 2018-08-01 2020-02-06 Carl Zeiss Ag Sensor fusion with mutual correspondence analysis of sensor data
DE102018220024B3 (en) * 2018-11-22 2020-03-12 Audi Ag Method for fusing sensor data from several sensors and fusion device for fusing sensor data from several sensors

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0913751A1 (en) 1997-11-03 1999-05-06 Volkswagen Aktiengesellschaft Autonomous vehicle and guiding method for an autonomous vehicle
DE69522486T2 (en) 1994-03-30 2002-04-18 Onera (Off Nat Aerospatiale) Procedure for tracking moving bodies
DE102005009146A1 (en) 2005-03-01 2006-09-21 Robert Bosch Gmbh Driver assistance system with several assistance functions
DE102006035207A1 (en) 2006-07-29 2008-01-31 Volkswagen Ag Object e.g. traffic-relevant object, determining device for vehicle, has evaluation device determining and supplying threshold values depending on supplied value, and radar device verifying whether object is provided in position



Also Published As

Publication number Publication date
DE102009006113A1 (en) 2009-09-10


Legal Events

Date Code Title Description
OR8 Request for search as to paragraph 43 lit. 1 sentence 1 patent law
R163 Identified publications notified

Effective date: 20130802

R012 Request for examination validly filed
R016 Response to examination communication
R018 Grant decision by examination section/examining division
R020 Patent grant now final
R082 Change of representative