EP1412773A1 - Method and device for exchanging and processing data - Google Patents

Method and device for exchanging and processing data

Info

Publication number
EP1412773A1
Authority
EP
European Patent Office
Prior art keywords
sensor
fusion
data
objects
sensors
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP02735078A
Other languages
German (de)
English (en)
Inventor
Albrecht Klotz
Werner Uhler
Martin Staempfle
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of EP1412773A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 - Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/14 - Adaptive cruise control
    • B60W30/16 - Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/003 - Transmission of data between radar, sonar or lidar systems and remote stations
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66 - Radar-tracking systems; Analogous systems
    • G01S13/72 - Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723 - Radar-tracking systems; Analogous systems for two-dimensional tracking by using numerical data
    • G01S13/726 - Multiple target tracking
    • G01S13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/88 - Radar or analogous systems specially adapted for specific applications
    • G01S13/93 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/932 - Radar or analogous systems specially adapted for anti-collision purposes of land vehicles using own vehicle data, e.g. ground speed, steering wheel direction
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/16 - Anti-collision systems

Definitions

  • ACC Adaptive Cruise Control
  • the method according to the invention and the device according to the invention with the features of the independent claims have the advantage that the detection performance - i.e. the quality of the sensor signals, the detection rate and the response behavior - of individual sensors or of sensor clusters is improved, the false alarm rate is reduced, and cases of sensor failure or sensor blindness can be diagnosed more easily and reliably.
  • in a central processing unit, for example an information platform IP or a sensor data fusion unit SDF, object information supplied by the individual sensors is processed and prepared and then distributed back to the individual sensors.
  • information is specifically distributed from the central processing unit back to the individual sensors or to the sensor clusters.
  • This includes, for example, the temporal assignment of the data of the fusion objects to the sensor objects and, conversely, the identification of the same objects in the processing unit and in the individual sensors as well as a prediction of the object movements.
  • the information flowing back to the individual sensors - for example the information that an object threatens to enter the detection range of another sensor - is used by the sensors for preconditioning, for example for lowering detection thresholds and/or for initializing filter parameters, so that overall a higher detection performance and detection reliability as well as an improved response of the object detection are achieved.
  • where the detection areas of different sensors overlap, use can be made of the different quality of individual, compatible sensor signals, for example by using the generally more accurate lateral-offset resolution of the object detection of a video sensor to support the angular position of the same object detected by a 77 GHz radar.
  • the higher level of networked data exchange can be used to reduce the false alarm rate of individual sensors and to help diagnose and interpret sensor failures or sensor blindness.
  • the processed and condensed information about objects in the vehicle environment can be made available to driving functions, such as vehicle guidance systems or vehicle safety systems.
  • an algorithmic method which allows current objects, such as sensor objects, to be assigned to historical objects, for example fusion objects or so-called “tracks", ie histories of measured values.
  • this assignment is referred to as a data association.
  • Further processing steps of the algorithmic method according to the invention include the steps of merging fusion objects, which is also referred to below as merging, and of generating new fusion objects, in particular to take object hypotheses into account. These tasks are carried out with high efficiency using the described method.
  • a computational effort is necessary which is proportional to the product n * m, where n denotes the number of fusion objects and where m denotes the number of sensor objects.
  • the computing effort for the merging step is proportional to n * n. Furthermore, it is advantageous according to the invention that the method according to the invention is carried out with a delayed decision logic, which, in the event of a conflict, allows the decision as to which measurement object, i.e. sensor object, is assigned to which object hypothesis to be finalized only in subsequent measurement cycles.
  • the processing steps of the method according to the invention make it possible to achieve the above-mentioned goals of more comprehensive, more reliable, faster and on average higher-quality object information and the tracking and identification of objects across the different detection areas of the sensors.
  • FIG. 1 shows a system for processing sensor data
  • FIG. 2 shows a partial aspect of object representation in a system according to the invention
  • FIG. 3 shows a flow chart for the data exchange between a processing unit according to the invention and sensors according to the invention
  • Figure 4 shows a structure diagram of the method according to the invention
  • FIG. 5 shows a first example of a measurement situation
  • Figure 6 shows the representation of the data association for the first example
  • FIG. 7 shows the fusion for the first example
  • FIG. 8 shows the representation of merging for the first example
  • FIG. 9 shows a second example of a measurement situation
  • Figure 10 shows the data association for the second example
  • FIG. 11 shows the representation of the fusion for the second example
  • Figure 12 shows the merging for the second example
  • Figure 13 shows the representation of the bridging of a detection gap
  • FIG. 14 shows the system for processing sensor data in a modified representation
  • Figure 15 is a processing diagram for determining an object size
  • Figure 16 shows a sub-algorithm for determining an object size
  • Figure 17 is a diagram of a plausibility management.
  • the system generally comprises a plurality of sensors, for example a first sensor 100 shown in FIG. 1, a second sensor 200 and a third sensor 300.
  • Each of the sensors 100, 200, 300 is connected to a bus system, which is identified by the reference symbol B.
  • the bus system B is also connected to a processing unit 400.
  • the bus system B is intended to ensure that the exchange of data between each of the sensors 100, 200, 300 and the processing unit 400 is ensured or can be carried out simply and quickly and that a certain bandwidth is available bidirectionally for data exchange.
  • the bus system B is provided according to the invention in particular as a CAN bus (controller area network bus). According to the invention, however, any other bus architecture can also be used.
  • CAN bus: controller area network bus
  • processing unit 400 is referred to in particular as sensor data fusion unit 400 or as information platform 400.
  • the sensors 100, 200, 300 are, in particular, individual sensors or also entire sensor clusters.
  • FIG. 2 shows a system according to the invention with, for example, two sensors, namely the first sensor 100 and the second sensor 200, as well as with the processing unit 400 and the bus B connecting these units.
  • FIG. 2 shows a first object 10 and a second object 20 in the outer environment of the motor vehicle (not shown), in which the first sensor 100 and the second sensor 200, as well as the processing unit 400 and the bus system B, are installed.
  • as sensors 100, 200, video sensors, radar sensors - for example 77 GHz long-range sensors, 77 GHz medium-range sensors, 24 GHz short-range sensors - lidar sensors, laser sensors or ultrasonic sensors can be used.
  • the interaction is, for example, that such a video sensor records the optical image of the vehicle surroundings and analyzes it in such a way that objects 10, 20 in the environment of the motor vehicle become recognizable.
  • In the case of radar sensors, the radar sensor emits a radar wave and senses the wave reflected by the environment, from which objects 10, 20 can be recognized.
  • Ultrasonic sensors, for example, can also be used as sensors 100, 200.
  • the various connections of the sensors 100, 200 to the environment of the vehicle are shown in FIG. 2 with the arrows provided with the reference numerals 30 and 31: the arrows with reference numeral 30 point back and forth between the first sensor 100 and the first object 10 and the second object 20, and the arrows 31 likewise point back and forth between the second sensor 200 and the first object 10 and the second object 20.
  • a certain preprocessing of the data stream is already carried out in the sensors 100, 200, which corresponds to the continuous or pulsed measurement or sensing of the vehicle surroundings. This is usually technically necessary because it enables data to be reduced and thus transmission bandwidth on bus system B can be saved.
  • the preprocessing in the sensors 100, 200 according to the invention consists in particular in that, based on the objects 10, 20 in the real physical environment of the vehicle equipped with the system according to the invention, so-called data objects are generated by the sensors 100, 200.
  • a curly bracket is shown to the left of the box representing the first sensor 100, which includes the reference numerals 110 and 120.
  • the reference numerals 110, 120 stand for such data objects generated by the first sensor 100, which were generated based on the objects, for example the first object 10 and the second object 20 in the real environment of the vehicle.
  • data objects generated by one of the sensors 100, 200 can be understood as a set of information belonging to a real object located in the environment of the vehicle - or even to an object that the sensor incorrectly recognized but which does not exist. Since the sensors 100, 200 generate such data objects and forward them to the processing unit 400, the data objects generated by the sensors 100, 200, 300 are also called sensor objects for simplicity. In the following, sensor objects are also referred to generally as sensor data.
  • In FIG. 2, a first sensor object 110 and a second sensor object 120 are shown with a double-sided arrow next to the first sensor 100 in the curly bracket already mentioned.
  • a third sensor object 210 and a fourth sensor object 220 are shown within a further curly bracket in addition to the second sensor 200.
  • the first and second sensor objects 110, 120 were generated in the example by the first sensor 100 and the third and fourth sensor objects 210, 220 were generated in the example by the second sensor 200.
  • the sensor objects 110, 120, 210, 220 can also include parts of real objects (for example in the case of video sensors, edges or parts of object outlines).
  • the information about the sensor objects 110, 120, 210, 220 is passed on to the processing unit 400 as a result of the processing of the data measured by the sensors 100, 200 via the bus system B.
  • the processing unit 400 carries out a further data reduction and generates in a manner similar to the sensors 100, 200 likewise so-called data objects, which, however, are called fusion objects or also generalizing fusion data to differentiate them from the sensor objects.
  • a curly bracket with a first fusion object 410 and a second fusion object 420 is shown in FIG. 2 in addition to the processing unit 400.
  • FIG. 2 also shows a controller 500 which is connected to the processing unit 400. According to the invention, it makes no difference whether the controller 500 is connected directly to the processing unit 400 or whether the controller 500 is connected to the processing unit 400 via the bus system B.
  • the first of these two alternatives is shown by way of example in FIG. 2.
  • The term fusion object is derived in particular from the synonymous name of the processing unit 400 as the sensor data fusion unit SDF 400. This is because the sensor objects supplied by the sensors 100, 200 are "fused" in the processing unit 400.
  • FIG. 3 shows a flow diagram for the data exchange between the sensor data fusion or the processing unit 400 and a sensor system shown as an example, consisting of a first sensor 100 as FMCW radar and a second sensor 200 as video sensor.
  • the flowchart in FIG. 3 represents a repetitive sequence of the environment detection by the sensors 100, 200. The state after a possibly provided initialization is shown, i.e. the "steady state". The explanation of Figure 3 therefore begins arbitrarily at one point in the control loop.
  • the first sensor 100 is provided as a radar sensor and supplies the processing unit 400 with a first sensor object 110 and a second sensor object 120. This is shown in FIG. 3 in such a way that an arrow pointing from the first sensor 100 to the processing unit 400 encloses a box containing the captions 110 and 120. Furthermore, the first sensor 100 transmits to the processing unit 400 a first time information, which is provided with the reference symbol 99. Since the first time information 99 is also transmitted to the processing unit 400, the reference number 99 also appears in the box which is enclosed by the arrow between the first sensor 100 and the processing unit 400. In a corresponding manner, the second sensor 200, which is designed as a video sensor in the example, sends a third sensor object 210 and a fourth sensor object 220 together with second time information 199 to the processing unit 400. This is represented, analogously to the arrow between the first sensor 100 and the processing unit 400, by an arrow between the second sensor 200 and the processing unit 400, which encloses a box labeled with the reference numerals 210, 220 and 199.
  • the first time information 99 and the second time information 199 are provided in particular as time stamps which are generated by the sensors 100, 200 and are transmitted to the processing unit 400. In a first variant of the invention, such time stamps are generated by the sensors 100, 200 in an "absolute manner", so that they do not relate to a reference point. In a second variant of the invention, however, it is also provided that the processing unit 400 sends a "central time information" to the sensors 100, 200 at regular or irregular time intervals, so that the first and the second time information 99, 199 are given as a "relative time value" in relation to the central time information of the processing unit 400. The central time information from the processing unit 400 to the sensors 100, 200 is shown in FIG. 3 as a box with the reference number 399.
  • the fusion objects are generated in the processing unit 400.
  • a first fusion object 410, a second fusion object 420 and a third fusion object 430 are shown in FIG. 3 by way of example.
  • the processing unit 400 has resources for managing a plurality of fusion objects 410, 420, 430.
  • An essential aspect of a fusion object 410, 420, 430 is a list, or a set, of the sensor objects that are included in the fusion object or assigned to it.
  • the sensors 100, 200, 300 that detect the first object 10 would each send a sensor object to the processing unit 400 that belongs to the first object 10 or represents it.
  • this information, i.e. the sensor objects supplied by the various sensors and relating to the first object 10, would now be summarized in the sensor object list of a fusion object, e.g. the first fusion object 410.
  • the first fusion object 410 includes the first sensor object 110 and the fourth sensor object 220, i.e. due to its coordinates and its speed, the first sensor object 110 detected by the radar sensor 100 corresponds to the fourth sensor object 220 detected by the video sensor and is therefore combined with it in the processing unit 400 to form the first fusion object 410.
  • the second fusion object 420 only comprises the second sensor object 120 and the third fusion object 430 in the example comprises the third sensor object 210.
  • the fusion objects 410, 420, 430 include so-called attributes, the attributes of the fusion objects 410, 420, 430 including, among other things, the physical properties and associated quality measures of the merged object data and a time stamp which assigns the data or attributes to a fusion cycle.
  • a fusion cycle corresponds to the time cycle which is provided in the processing unit 400 to keep the data consistent and to update it.
  • Another group of attributes of the fusion objects 410, 420, 430 describes the assignment of the sensor objects to the fusion objects, for example via a sensor object identification (for example a number or a name) and a relative time stamp that indicates the time of the Contains original measurement relative to the central time stamp or the central time information 399 from the processing unit 400.
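As an illustration of the object representation described in the preceding paragraphs, the following is a minimal sketch (in Python, not part of the patent text) of how sensor objects and fusion objects with the attributes mentioned above - physical state, quality measures, time stamps and the list of assigned sensor objects - could be modeled; all field names are assumptions chosen for readability.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SensorObject:
    """Data object generated by a single sensor (e.g. 110, 120, 210, 220)."""
    sensor_id: int          # which sensor produced the measurement (100, 200, 300)
    object_id: int          # sensor-local identification ("tracking number", e.g. 111, 221)
    timestamp: float        # time of the original measurement, relative to the central time 399
    x: float                # position in the common vehicle coordinate system
    y: float
    vx: float               # velocity components
    vy: float
    quality: float          # quality measure of this measurement (0..1)

@dataclass
class FusionObject:
    """Fusion object managed by the processing unit 400 (e.g. 410, 420, 430)."""
    fusion_id: int
    x: float
    y: float
    vx: float
    vy: float
    quality: float = 0.0        # quality measure of the fused attributes
    plausibility: float = 0.0   # object plausibility, incremented/decremented each cycle
    fusion_cycle: int = 0       # time stamp assigning the attributes to a fusion cycle
    sensor_objects: List[SensorObject] = field(default_factory=list)  # assigned sensor objects
```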
  • the time information 99, 199 and the central time information 399 are exchanged in each measurement or fusion cycle, regardless of whether object data, i.e. information relating to a sensor object or a fusion object, is also transmitted along with the time information 99, 199 or the central time information 399.
  • the basis for this is provided on the one hand by a priori knowledge, for example about the quality of the respective sensor data and the detection ranges of the sensors, and on the other hand by the current quality information which relates to a measurement in a sensor 100, 200 and which is transmitted from the sensor to the processing unit 400 together with the data record of the sensor object. Furthermore, the history of the fusion object data is a basis for deciding which information and data flow back to which sensor 100, 200, 300. Taking the signal quality into account requires the compatibility of the data and the transformation to a uniform coordinate system.
  • the processing unit 400 carries out a quality check.
  • This is illustrated in FIG. 3 by the boxes labeled with the reference symbols 402 and 403, to which arrows point in each case starting from the box labeled with the reference symbol 410.
  • In the quality check with the reference symbol 402 it is determined, for example, that the y coordinate of the real object in the vehicle surroundings, which is represented by the first fusion object 410, is measured better by the second sensor 200, i.e. the video sensor, than by the first sensor 100, i.e. the radar sensor.
  • The value for the y coordinate of the first fusion object 410 measured by the second sensor 200 is therefore forwarded to the first sensor 100 in order to enable the first sensor 100 to measure the y coordinate of the first fusion object 410, or more generally of the assigned first sensor object 110, more precisely.
  • In addition to the value of the y component, the processing unit also transmits an identification of the relevant sensor object, in the case considered in the example the first sensor object 110.
  • In the quality check represented by reference numeral 403, the speed component in the x direction is transmitted to the second sensor 200, i.e. the video sensor, because the value for the speed component in the x direction supplied by the radar sensor, i.e. the first sensor 100, is usually (a priori knowledge) better than the value for the x component of the speed measured by the second sensor 200, i.e. the video sensor.
  • the transmission of the x component of the speed to the second sensor 200 is in turn carried out together with the transmission of an identification for the corresponding sensor object, ie for the fourth sensor object 220 in the case considered in the example.
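A minimal sketch of the kind of rule behind the quality checks 402 and 403: the attribute that is known a priori to be measured better by one sensor type is fed back to the other sensors contributing to the same fusion object, together with the identification of their own sensor object. The data structures, the sensor-type table and the function name are assumptions for illustration, not the patent's implementation.

```python
# A priori knowledge about which sensor type measures which attribute best (assumed for illustration).
BEST_SOURCE = {"y": "video", "vx": "radar"}

def feedback_messages(fusion_obj, sensor_type_of):
    """Yield (target sensor, sensor object id, attribute, value) tuples to send back on the bus.

    fusion_obj is assumed to carry the list of assigned sensor objects (see the sketch above);
    sensor_type_of maps a sensor id (100, 200, ...) to its type ("radar", "video", ...).
    """
    for attr, best_type in BEST_SOURCE.items():
        best = [so for so in fusion_obj.sensor_objects if sensor_type_of[so.sensor_id] == best_type]
        others = [so for so in fusion_obj.sensor_objects if sensor_type_of[so.sensor_id] != best_type]
        if not best or not others:
            continue                      # feedback only makes sense if both sensor types contribute
        value = getattr(best[0], attr)    # e.g. the y coordinate measured by the video sensor
        for so in others:
            # the value is sent together with the identification of the receiving sensor's own
            # sensor object (e.g. identification 111 of the first sensor object 110)
            yield so.sensor_id, so.object_id, attr, value
```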
  • the reference symbol 111 stands for the identification of the first sensor object 110 and the reference symbol 221 for the identification of the fourth sensor object 220.
  • the arrow starting from the quality check 402 via the box with the reference symbol 111 to the box with the reference symbol 406 illustrates the transmission of the identification 111 of the first sensor object 110 to the first sensor 100.
  • Correspondingly, the arrow starting from the quality check 403 via the box with the reference symbol 221 to the box with the reference symbol 407 illustrates the transmission of the identification 221 of the fourth sensor object 220 to the second sensor 200.
  • the second fusion object 420 only comprises the second sensor object 120. Therefore, the identification 121 of the second sensor object 120 is returned to the first sensor 100, which is represented by an arrow from the box labeled with the reference number 420 to the box labeled with the reference number 406, which includes a box with the reference number 121 for the identification of the second sensor object 120.
  • In a method step, which is represented by a box provided with the reference symbol 404, it is checked whether the second fusion object 420 enters or threatens to enter the detection range of the second sensor 200 or not.
  • This box in FIG. 3 has a first output, which is symbolized by an arrow ending in a dash to indicate that the processing can terminate at this point.
  • The processing stops when it is determined that the second fusion object 420 does not enter the detection range of the second sensor 200. If, on the other hand, this is the case (i.e. the second fusion object 420 enters or threatens to enter the detection range of the second sensor 200), an identification for a further sensor object to be generated is sent to the second sensor 200. This identification for a further sensor object, which is not yet contained in the sensor object list that is sent from the second sensor 200 to the processing unit, is shown in FIG. 3 with the reference symbol 231.
  • the coordinate data belonging to the second fusion object 420 are sent to the second sensor 200.
  • This coordinate data includes, in particular, the x and y components of the distance and the x and y components of the speed of the second fusion object 420.
  • the identification 211 for the third sensor object 210 is sent to the second sensor 200, and in a function block 405 the decision is made as to whether the third sensor object 210 or the third fusion object 430 will enter the detection range of the first sensor 100 or not.
  • an identification 131 for a (not shown) sixth sensor object is therefore sent to the first sensor 100 together with the coordinate data of the sixth sensor object in the event of intrusion into the detection range of the first sensor 100.
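The decision made in the function blocks 404 and 405 can be sketched as a simple prediction-and-containment test: the fusion object is predicted one cycle ahead and it is checked whether the predicted position lies within the detection area of a sensor that does not yet see the object. The rectangular detection area and the constant-velocity prediction are simplifying assumptions, not the patent's implementation.

```python
def will_enter(fusion_obj, detection_area, dt):
    """detection_area = (x_min, x_max, y_min, y_max); a rectangle is an assumption for illustration."""
    x_pred = fusion_obj.x + fusion_obj.vx * dt   # simple constant-velocity prediction one cycle ahead
    y_pred = fusion_obj.y + fusion_obj.vy * dt
    x_min, x_max, y_min, y_max = detection_area
    return x_min <= x_pred <= x_max and y_min <= y_pred <= y_max

# If True, the processing unit sends an identification for a new sensor object (e.g. 231)
# plus the coordinate data of the fusion object to the corresponding sensor (preconditioning).
```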
  • the data exchange between the sensors 100, 200, 300 and the processing unit 400 basically takes place from the following points of view:
  • Objects are found again in the sensor, which is done by exchanging identifications 111, 221, 121, 211 of sensor objects - these identifications are also called tracking numbers - provided there is no overlap of the sensor detection areas.
  • an improvement in the response behavior of sensors is further brought about in that object information about an object - for example the first object 10 - is forwarded to a sensor - for example the first sensor 100 - which has not yet detected that object, when the first object 10 has been detected by another sensor - for example the second sensor 200 - and the predicted movement of the first object 10 suggests that it has already entered, or will soon enter, the detection range of the first sensor 100.
  • the first sensor 100 can then recognize the first object 10 entering its detection area more quickly and can initialize internal parameters, such as the filter settings, in a targeted manner. This is referred to as preconditioning of the first sensor 100. This is illustrated in FIG. 3 for the second and third fusion objects 420, 430 by the decision-making process in the boxes identified by reference numerals 404 and 405, respectively.
  • an increase in the detection power is further brought about by the fact that objects which are in the common detection range of several sensors and which are not detected by all sensors are reported to sensors which do not detect these objects. This is done with the aim of specifically influencing and controlling the sensor attention of such a sensor that does not detect the object (attention control), for example by lowering threshold values, by refining the discretization, etc.
  • the detection performance of the individual sensors can thus be increased.
  • the information to be transmitted from the processing unit 400 to the sensors can in particular be given a prioritization, so that if the bandwidth which can be transmitted via the bus system B is limited, the data volume is reduced in such a way that only the most highly prioritized information to be transmitted from the processing unit 400 to the sensors 100, 200, 300 is transmitted via the bus system B. It is therefore provided in FIG. 3 to route the information flowing to the sensors 100, 200 through a prioritization unit, which is provided with the reference symbol 406 for the information to be transmitted to the first sensor 100 and with the reference symbol 407 for the information to be transmitted to the second sensor 200.
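The bandwidth-limited feedback described above can be pictured as a simple priority scheme, sketched below: the messages destined for a sensor are sorted by priority and only as many are sent as fit into the available bus budget. Message sizes, priorities and the byte budget are assumptions for illustration.

```python
def prioritize(messages, max_bytes):
    """messages: list of (priority, size_in_bytes, payload); higher priority is sent first."""
    sent, used = [], 0
    for prio, size, payload in sorted(messages, key=lambda m: m[0], reverse=True):
        if used + size <= max_bytes:        # only the most highly prioritized data fit on the bus
            sent.append(payload)
            used += size
    return sent
```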
  • FIG. 4 shows a structure diagram of the evaluation algorithm of sensor objects.
  • the algorithm is carried out in particular in the processing unit 400.
  • the invention also provides for various processing steps of the evaluation algorithm to be carried out in distributed systems.
  • the sensors 100, 200, 300 are also shown in FIG. 4.
  • the sensors 100, 200, 300 deliver first data, which are represented in FIG. 4 by an arrow with the reference symbol 409, to the processing unit 400.
  • the first data 409 are synchronized in the first processing step 408.
  • synchronization of the first data 409 to a base clock of the processing unit 400 is carried out.
  • the details of this synchronization step are shown in particular in the simultaneously filed German patent application by the same applicant, which bears the title "Method for synchronization and device".
  • the first processing step 408, i.e. the synchronization, supplies, based on the first data 409 present in particular as sensor objects 110, 120, 210, 220, time-synchronized second data 411. This is shown by an arrow with the reference symbol 411 starting from the first processing step 408.
  • data from various environment sensors 100, 200, 300 - in particular in the processing unit - are first transformed in the first processing step 408 to a uniform coordinate system (data alignment) and synchronized in time.
  • a current sensor object list with measurement objects is created for each sensor.
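A possible sketch of the first processing step 408 (data alignment and time synchronization): each sensor object is transformed from the sensor's mounting frame into the uniform vehicle coordinate system and then predicted to the base clock of the processing unit using its relative time stamp. The mounting parameters and the constant-velocity prediction are assumptions; the patent does not prescribe a particular transformation.

```python
import math

def align_and_sync(so, mount_x, mount_y, mount_yaw, t_base):
    """Transform sensor object 'so' into the common coordinate system and predict it to t_base."""
    # coordinate transformation (data alignment) from the sensor frame to the vehicle frame
    c, s = math.cos(mount_yaw), math.sin(mount_yaw)
    x = mount_x + c * so.x - s * so.y
    y = mount_y + s * so.x + c * so.y
    vx = c * so.vx - s * so.vy
    vy = s * so.vx + c * so.vy
    # temporal synchronization: predict from the measurement time stamp to the base clock
    dt = t_base - so.timestamp
    return x + vx * dt, y + vy * dt, vx, vy
```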
  • the second data 411 are sent to a second processing step 419 or made available to it.
  • the second processing step 419 is also referred to below as association step 419.
  • the so-called data association is carried out, ie an attempt is made to assign the measurement objects to one or more existing fusion objects. If no assignment is made, a new fusion object is generated.
  • the result of the assignment or association can be recorded, for example, in an association matrix 422 (described further below) in which all possible object hypotheses (assignment of measured objects to existing objects or object hypotheses) are registered.
  • the third processing step 429 is also referred to below as the fusion step 429.
  • the third data 421 comprise in particular the so-called association matrix 422.
  • In the third processing step 429, fourth data 431 and fifth data 432 are generated.
  • the association matrix is processed line by line and new fusion objects or fusion data are formed by forming weighted average values of the relevant object attributes. Objects or object hypotheses that are no longer measured by a sensor for a certain period of time are rejected.
  • the fourth data 431 include, in particular, a so-called fusion object list 431 and are made available to the second processing step 419.
  • the fifth data 432 which in particular comprises so-called tracking data 432 and is produced by the third processing step 429, ie the data fusion 429, are made available both to the second processing step 419 and to a fifth processing step 450.
  • the fourth data 431, ie the fusion object list 431 is also made available to a fourth processing step 440, a so-called merging processing step 440, which is also referred to below as merging step 440.
  • the newly calculated fusion objects which lie within a capture range (only shown in FIG. 5) are fused into one object.
  • the basis for this - similar to the association - is a gating process.
  • the fourth processing step 440 produces sixth data 441, which are made available to the fifth processing step 450, with the fifth processing step 450 performing temporal filtering of the object data, for example using low-pass or Kalman filters, with smoothing being carried out in the time dimension. Furthermore, in the fifth processing step 450, a prediction is made for a further time step.
  • the sixth data 441 include in particular a so-called compressed fusion object list.
  • seventh data 451 are generated which are made available to a sixth processing step 460.
  • the sixth processing step 460 is also referred to below as evaluation step 460.
  • the seventh data 451 include in particular a filtered and predicted fusion object list.
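The temporal filtering and prediction of the fifth processing step 450 can be sketched, for one coordinate, with a simple alpha-beta filter used here as a stand-in for the low-pass or Kalman filters mentioned above; the gains and the constant-velocity model are assumptions.

```python
def filter_and_predict(x_est, v_est, x_meas, dt, alpha=0.5, beta=0.1):
    """One alpha-beta filter cycle for a single coordinate, followed by a one-step prediction."""
    x_pred = x_est + v_est * dt          # prediction to the current fusion cycle
    r = x_meas - x_pred                  # innovation (residual)
    x_new = x_pred + alpha * r           # smoothed (filtered) position
    v_new = v_est + (beta / dt) * r      # smoothed velocity
    x_next = x_new + v_new * dt          # prediction for the next time step
    return x_new, v_new, x_next
```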
  • a plausibility check of the objects and an object selection are carried out.
  • the plausibility check is provided according to the invention in particular in a function-specific manner.
  • eighth data 461 are generated or selected for forwarding to the sensors 100, 200, 300.
  • the eighth data 461 are also referred to below as feedback data 461.
  • in the sixth processing step 460 there is an optional data exchange with at least one downstream information, comfort or security function 500 which uses the environmental information as input variables.
  • This downstream function corresponds to controller 500 or is localized in controller 500.
  • it is provided in particular to carry out a prioritization of the fusion objects in the sixth processing step 460, as is described below.
  • FIG. 5 shows a first example of a measurement situation.
  • a coordinate system with an x-axis and a y-axis is shown, which represents a coordinate system of the vehicle environment common to several sensors.
  • the x coordinate and the y coordinate represent, for example, location coordinates or angle coordinates or speed coordinates.
  • the first sensor 100 "sees" an object - for example the first object 10 - at a first time step (not shown) at a first position, which is identified by the reference symbol 101.
  • the object recognized by the first sensor 100 at 101 in the first time step is expected in a subsequent second time step (likewise not shown) at a position designated by the reference symbol 102 (prediction).
  • an object is recognized by the second sensor 200 in the first time step at the position designated by the reference number 201 and expected in the second time step at the position designated by the reference number 202.
  • the third sensor 300 recognizes two objects: one object is recognized at the first point in time at the position designated by the reference number 301 and expected at the second point in time at the position designated by the reference number 302, and another object is recognized at the first point in time at the position designated by the reference number 303 and expected at the second point in time at the position designated by reference numeral 304.
  • FIG. 5 shows a first association gate with the reference symbol 423 and a second association gate with the reference symbol 424.
  • FIG. 6 shows the association matrix for the first example
  • FIG. 7 shows the representation of the matrix after the fusion for the first example
  • FIG. 8 shows the representation of the matrix after merging for the first example.
  • In FIG. 6, which represents the association matrix 422 for the first example, one line is provided for the fusion object which is assumed in the second time step at the position denoted by the reference symbol 476, and another line for the fusion object which is assumed at the position denoted by the reference symbol 473.
  • the association matrix 422 establishes a connection between a fusion object and one or more sensor objects. In FIG. 6, only the reference numerals are given to represent this connection, which correspond to the positions of the corresponding objects.
  • For the association matrix 422, a global statistical distance measure, for example the Mahalanobis norm, or also signal-specific distance measures d between each sensor object and each fusion object are calculated, for example by means of the Euclidean distance.
  • the distance measures are generally defined in phase space, i.e. they can also contain speed data in addition to the position data.
  • FIG. 5 and FIG. 9 only include position data.
  • a possible assignment (ie an object hypothesis) between the sensor object and the fusion object is assumed if the distance d is less than a threshold value (gate or association gate 423, 424) or can be assigned to the object with a confidence by means of a statistical test or all distances d are smaller than the threshold values.
  • a quality measure for the quality of the association can be assigned, which, for example, indicates how much smaller the distance d is than the threshold value.
  • the threshold values can be different for each signal or for each object.
  • the thresholds may vary depending on variables such as the measurement errors of the individual measurement and of the fusion object, the distance of the objects from the host vehicle, the relative speed of the objects, the assignment to the predicted driving corridor of the host vehicle, etc.
  • For each possible assignment, an entry is made in the association matrix.
  • the rows in the association matrix correspond, for example, to the index of the fusion object in the fusion object list and the columns correspond, for example, to sensors 100, 200, 300 and the sensor type.
  • An entry in the association matrix consists, for example, of the distance d, a value for the quality of the association, the object number of the sensor object, a cross reference to the point at which the sensor object can be found in the sensor-specific object list, and an "occupied" mark. If a position in the association matrix is already occupied, the sensor object is appended to the fusion object list and the counter for the number of current fusion objects is incremented.
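The gating and bookkeeping just described can be sketched as follows: a statistical distance is computed between every fusion object and every sensor object, and every pair whose distance stays below the association gate produces an entry (multiple hypotheses are allowed); sensor objects that fall into no gate start a new fusion object. The two-dimensional state, the diagonal covariance and the quality formula are assumptions.

```python
import math

def mahalanobis(fo, so, var_x=1.0, var_y=1.0):
    """Statistical distance in a 2-D position phase space with an assumed diagonal covariance."""
    return math.sqrt((fo.x - so.x) ** 2 / var_x + (fo.y - so.y) ** 2 / var_y)

def associate(fusion_list, sensor_objects, gate):
    """Build association entries: (fusion object index, sensor object, distance, association quality)."""
    entries = []
    for i, fo in enumerate(fusion_list):
        for so in sensor_objects:
            d = mahalanobis(fo, so)
            if d < gate:                                    # association gate 423 / 424
                entries.append((i, so, d, 1.0 - d / gate))  # quality: margin below the gate
    # sensor objects that fall into no gate start a new fusion object (new object hypothesis)
    unassigned = [so for so in sensor_objects if all(e[1] is not so for e in entries)]
    return entries, unassigned
```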
  • the described method deliberately includes a multiple assignment of sensor objects to existing fusion objects (multi-hypotheses).
  • sorted fusion object lists or sensor object lists can be used, the processing of which can be interrupted (a kind of "gating") if a threshold is exceeded (for example with regard to the distance), after which no more assignment will certainly take place.
  • FO_n includes SO_k, i.e. fusion object n includes sensor object k.
  • FIG. 5 shows a merging gate 442, which is also referred to below as catch area 442.
  • In the fusion module, i.e. the third processing step 429, new fusion object attributes are calculated for the objects in the fusion list. This is done by processing the association matrix 422 line by line; the matrix contains all the necessary attributes that ensure the weighting of the new fusion objects and access to the original measurement data. If several sensors contribute to a fusion object, a weighting of the individual signals of the sensors is used. The weights are determined from the quality of the association and the sensor data quality.
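A minimal sketch of the weighted averaging performed when several sensors contribute to one fusion object; the weights combine the association quality and the sensor data quality as described above, but the exact combination formula is an assumption.

```python
def fuse_attribute(contributions):
    """contributions: list of (value, association_quality, sensor_quality) for one attribute."""
    weights = [aq * sq for _, aq, sq in contributions]
    total = sum(weights)
    if total == 0.0:
        return None
    return sum(v * w for (v, _, _), w in zip(contributions, weights)) / total

# Example: y position of fusion object 410 fused from a radar (110) and a video (220) measurement.
y_fused = fuse_attribute([(2.1, 0.9, 0.6), (2.4, 0.8, 0.9)])
```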
  • Another fusion object attribute, the object plausibility, is determined in the fusion module, i.e. the third processing step 429.
  • the object plausibility is incremented, for example, as soon as a fusion object has been confirmed by at least one sensor object. The increment depends on the number of associated sensor objects and the ratio of the cycle time of the sensor fusion to the cycle time of a sensor cycle (ie the topicality of the sensor object data). If a fusion object is not confirmed in the current fusion cycle, the object plausibility is decremented. The increment and decrement of the object plausibility can also depend on the current value of the object plausibility.
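The plausibility management described above, sketched as a simple increment/decrement rule per fusion cycle; the clamping to the interval [0, 1] and the specific increment values are illustrative assumptions.

```python
def update_plausibility(p, n_confirming_sensor_objects, cycle_ratio, inc=0.1, dec=0.05):
    """One plausibility cycle: increment if confirmed by sensor objects, decrement otherwise.

    cycle_ratio = fusion cycle time / sensor cycle time (topicality of the sensor object data).
    """
    if n_confirming_sensor_objects > 0:
        p += inc * n_confirming_sensor_objects * cycle_ratio
    else:
        p -= dec
    return min(1.0, max(0.0, p))   # keep plausibility in [0, 1]; a value of 0 rejects the hypothesis
```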
  • the fusion object list is processed again in order to find and merge objects that lie within a capture range.
  • assignments of sensor objects of the same sensor type to a fusion object are implicitly taken into account, and object hypotheses that differ only slightly due to the merged data are combined to form a fusion object.
  • the procedure is similar to that of the association described.
  • the merging gates 442 are generally smaller than the association gates 423, 424, and any two objects that lie within a capture area 442 are immediately fused ("merged") into a new object. When objects are merged, the respective signal quality can be used to weight the individual signals.
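The merging step 440 can be sketched as a second, tighter gating pass over the fusion object list: any two fusion objects that lie within the capture range 442 are immediately combined, with the individual signals weighted by their quality. The quadratic scan mirrors the n * n effort mentioned earlier; the quality-weighted average is an assumption.

```python
import math

def merge_fusion_objects(fusion_list, capture_range):
    """Fuse every pair of fusion objects that lie within the capture range 442."""
    merged, used = [], set()
    for i, a in enumerate(fusion_list):
        if i in used:
            continue
        for j in range(i + 1, len(fusion_list)):
            b = fusion_list[j]
            if j not in used and math.hypot(a.x - b.x, a.y - b.y) < capture_range:
                w = (a.quality + b.quality) or 1.0
                a.x = (a.x * a.quality + b.x * b.quality) / w   # quality-weighted position
                a.y = (a.y * a.quality + b.y * b.quality) / w
                a.sensor_objects += b.sensor_objects            # keep all assigned sensor objects
                used.add(j)
        merged.append(a)
    return merged
```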
  • The association matrix 422 (reduced by the sensor objects), which is shown in FIG. 7 and FIG. 8, comprises the same number of objects, namely two, because the objects recognized at the positions designated by the reference numerals 476 and 473 do not come close enough to each other to lie within one capture area 442.
  • A merging of the objects in the fourth processing step 440 therefore did not take place in the first example.
  • FIG. 9 shows a second example of a measurement situation.
  • a coordinate system with an x-axis and a y-axis is shown again.
  • the x coordinate and the y coordinate in turn represent, for example, location coordinates or angle coordinates or speed coordinates.
  • the first sensor 100 "sees" an object - for example the first object 10 - at a first (not shown) time step at a first position, which is identified by the reference symbol 111.
  • the object recognized by the first sensor 100 at 111 in the first time step is expected in a subsequent second time step (likewise not shown) at a position designated by the reference symbol 112 (prediction).
  • an object is recognized by the second sensor 200 in the first time step at the position designated by reference number 211 and expected in the second time step at the position designated by reference number 212.
  • the third sensor 300 in turn detects two objects: one object is recognized at the first point in time at the position designated by the reference symbol 311 and expected at the second point in time at the position designated by the reference symbol 312, and another object is recognized at the first point in time at the position designated by the reference symbol 313 and expected at the second point in time at the position designated by reference numeral 314.
  • the first association gate is again designated with the reference symbol 423 and a second association gate with the reference symbol 424.
  • FIG. 10 shows the association matrix 422 for the second example
  • FIG. 11 shows the representation of the matrix after the fusion for the second example
  • FIG. 12 shows the representation of the matrix after merging for the second example. Since the expected positions provided with the reference numerals 112, 212 and 314 for the second time step lie within the first association gate 423, their evaluation leads to the hypothesis that only a single object was recognized by different sensors at (slightly) different positions (due to measurement errors). Therefore, the positions provided with the reference numerals 112, 212, 314, or the corresponding sensor objects, are assigned to a fusion object which was measured in the first time step at the position designated with the reference number 484, for which the position with the reference number 485 is expected in the second time step (prediction), and for which in the second time step the position denoted by the reference symbol 486 is "measured" or calculated.
  • In FIG. 10, which represents the association matrix 422 for the second example, one row is provided for the fusion object which is assumed in the second time step at the position designated by the reference symbol 485.
  • the position provided with the reference symbol 312 or the corresponding sensor object is assigned to a fusion object which was measured in the first time step at the position designated with the reference symbol 481, for which the position with the reference symbol 482 is expected in the second time step (prediction) and for which in the second time step the position denoted by reference numeral 483 is "measured” or calculated.
  • a line is therefore provided in FIG. 10 for the fusion object, which is assumed in the second time step at the position denoted by reference numeral 482.
  • association matrix 422 also contains further information.
  • Additional lines - the second and third lines of the association matrix shown in FIG. 10 - were inserted in association step 419, because it could not be clearly determined whether the positions identified by the reference numerals 112 and 312 represent a single object or two separate objects. This made it necessary to insert a line, which is given the reference symbol 489 in FIG. 10, and thus to generate a "hypothetical" fusion object to represent the additional object hypothesis "two separate objects".
  • FIGS. 11 and 12 show the association matrices (again reduced) after the fusion step and after the merging step.
  • the merging step leads to a combination of the three fusion objects still present after the fusion step, because all positions of the objects relating to the second time step (the positions identified by reference numerals 486 and 483) lie within one merging gate 442. These fusion objects are thus combined to form a single fusion object, which is assumed at the second point in time at the position denoted by reference numeral 487.
  • FIG. 13 shows a typical situation that occurs when several environment sensors or sensors have different detection areas.
  • reference number 1 denotes the device which has various sensors, for example the first sensor 100 (not shown in FIG. 13) and the second sensor 200 (also not shown in FIG. 13).
  • the device 1 is provided in particular as a motor vehicle 1 which has the sensors 100, 200.
  • FIG. 13 also shows a first detection area 150 of a sensor, for example the first sensor 100, and a second detection area 250 of a sensor, for example the second sensor 200.
  • A first object 10 and a second object 20 are also shown in FIG. 13; the objects are in particular other motor vehicles.
  • FIG. 13 shows the situation that the first object 10 is in the first detection area 150 and is about to leave it, while the second object 20 is in the second detection area 250 and is also about to leave it.
  • FIG. 13 therefore shows a first exit area 152 and a second exit area 252, the first exit area 152 representing the location or area where the first object 10 leaves the first detection area 150 and the second exit area 252 representing the location or area where the second object 20 leaves the second detection area 250.
  • objects 10, 20 are represented in processing unit 400 as fusion objects, for example as the first and second fusion objects 410, 420. It is thus possible according to the invention to "bridge" the detection gap 160 between the detection areas in such a way that the movements of the objects 10, 20 relative to the detection areas 150, 250 are estimated (predicted), and it can therefore also be estimated whether and, if so, when and where one of the objects 10, 20 which is just leaving a detection area 150, 250 will re-enter another detection area 150, 250. FIG. 13 therefore also shows a first entry area 251, which indicates the estimated location where the first object 10 enters the second detection area 250, and a second entry area 151, which indicates the estimated location where the second object 20 enters the first detection area 150.
  • Processing unit 400 processes the sensor objects from different sensors together and manages them as fusion objects.
  • a data record of a tracked fusion object or an object hypothesis consists among other things of the position, the speed, the acceleration and an object plausibility.
  • the fusion object has already been tracked over a number of detection cycles, and the history of the fusion object, i.e. its "tracking data", is present in parametric, model-based form, for example in the form of Kalman filter coefficients.
  • the detection areas 150, 250 of the individual sensors are known and are described in mathematical form. The edges of the detection areas 150, 250 need not be sharply delimited, but can be described by tolerance bands.
  • the plausibility is a measure of the reliability of the object hypothesis; for example, a number between zero and one indicates the plausibility, where the value zero corresponds to an improbable (and to be rejected) object hypothesis and the value one corresponds to a very probable object hypothesis.
  • the plausibility of an object hypothesis ie a fusion object, is recalculated by incrementing or decrementing each cycle of the algorithm.
  • the size of the increment or decrement is variable and essentially determines the life cycle of an object hypothesis. The fact that, for example, the first object 10 leaves the first detection area 150 in FIG. 13 is represented in the fusion object representation within the processing unit 400 in such a way that the object movement of the first fusion object 410 is predicted from its history, i.e. its tracking data, to the next or a later expected measured value, and it is determined that the limit of the first detection area 150 has been exceeded. If the predicted position lies outside the area limits of the first detection area 150, the object 10, or its data representation in the form of the first fusion object 410, will leave the first detection area 150. The possible re-entry into the adjacent second detection area 250 is determined in that, starting from the calculated first exit area 152, the expected trajectory is predicted into the future until an entry into the second detection area 250 is expected.
  • Otherwise, the first object 10 is "de-plausibilized" with a standard decrement, i.e. the value of its plausibility is reduced in a predetermined manner. This case is not shown in Figure 13.
  • the maximum period of time is determined that is likely to elapse before a possible entry into the second detection area, which is done, for example, by dividing the longest possible distance in the detection gap 160 by the relative speed.
  • the plausibility decrement can be reduced in such a way that the object or the object hypothesis is not rejected, as long as it is likely to be in the detection gap 160.
  • the calculation of the expected entry area and the plausibility decrement is performed anew each time the algorithm is repeated, taking into account the speed and yaw movement of the vehicle 1. If the first object 10 is not detected at the predicted location and at the predicted time in the second detection area 250, the object hypothesis is rejected.
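The bridging of the detection gap 160 can be sketched as follows: the maximum time the object can plausibly spend in the gap is estimated by dividing the longest possible distance in the gap by the relative speed, and the standard plausibility decrement is suppressed for that long; afterwards the standard decrement applies again, and a missing re-detection leads to rejection of the hypothesis. The function and its parameters are assumptions for illustration.

```python
import math

def gap_bridging_decrement(fusion_obj, longest_gap_distance, standard_decrement, time_in_gap):
    """Plausibility decrement to apply while the object is presumably inside detection gap 160."""
    rel_speed = max(1e-3, math.hypot(fusion_obj.vx, fusion_obj.vy))  # avoid division by zero
    t_max = longest_gap_distance / rel_speed   # longest plausible dwell time in the gap
    if time_in_gap <= t_max:
        return 0.0                  # suppress de-plausibilization while the object may still be in the gap
    return standard_decrement       # after t_max the standard decrement applies again
```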
  • If the first object 10 is detected at the predicted location and at the predicted time in the second detection area 250, it is identified with the old object. Particularly for sensors that do not measure the speed and acceleration of objects directly, this enables improved and faster signal dynamics of the fusion object when it re-enters the second detection area 250, since it is not necessary to use inaccurate estimates for filter initialization; instead, better, predicted estimates can be used. This is because the signal qualities of fusion objects (which also include information about the history of the object) are generally greater than the signal qualities of sensor objects.
  • the downstream driving function which is usually located in the controller 500, can use the improved signal quality of the fusion object when it re-enters the second detection area 250 by reacting faster and more reliably.
  • According to the invention, an exchange of object information is provided between the processing unit 400 and the sensors 100, 200, 300 such that object information is sent to the sensor into whose detection range an object is expected to enter.
  • As a result, this sensor can detect the first object faster, more reliably and more precisely.
  • the described bridging algorithm of detection gaps 160 can also be extended with small modifications to problems of object tracking in the case of temporary occlusions.
  • FIG. 14 shows the system for processing sensor data in a modified representation.
  • the system again comprises the first sensor 100, the second sensor 200 and the third sensor 300.
  • Each of the sensors 100, 200, 300 is connected to the bus system B, which is also connected to the processing unit 400.
  • FIG. 14 also shows the first object 10 and the second object 20 in the outer environment of the motor vehicle 1 (not shown in FIG. 14), in which the first sensor 100, the second sensor 200, the third sensor 300, the processing unit 400 and the bus system B are installed.
  • One aspect of the present invention is to determine the object size of objects 10, 20.
  • FIG. 14 shows the length with the reference symbol "L” as examples of object sizes for the first object 10 and the width with the reference symbol "W" for the second object 20.
  • the objects 10, 20 - and thus also their extent - are detected by the sensors 100, 200, 300, as described in principle in FIG. 2.
  • the arrows provided with the reference numerals 30, 31 have been omitted in FIG. 14 for the sake of simplicity.
  • the sensor objects 110, 120, 210, 220 and fusion objects 410, 420 shown in FIG. 2 are omitted in FIG. 14 for the sake of simplicity.
  • the system according to FIG. 14 also works in an analogous manner to the system according to FIG. 2.
  • the processing unit 400 processes the various pieces of information provided by the sensors 100, 200, 300, whereby object hypotheses are generated as fusion objects.
  • the goal is the spatial and temporal accumulation of potential size information. Determined object sizes can be used to evaluate and interpret the driving environment.
  • the object size is a reliable attribute of a fusion object, the object size being used in a classification of detected objects.
  • the object size can be forwarded to vehicle guidance systems or driver assistance systems and significantly increases the level of detail of a driving environment detected by sensors.
  • FIG. 15 shows an overall method according to the invention for determining the object size.
  • the calculation of the width of an object is shown as an example.
  • the following situation is shown by way of example in FIG. 15: the first sensor 100 supplies the first sensor object 110 and the second sensor 200 supplies the third sensor object 210 and the fourth sensor object 220.
  • This is represented in FIG. 15 by arrows: an arrow from the box designated by the reference symbol 100 to the box designated by reference numeral 110, and an arrow from the box designated by reference numeral 200 to the box designated by reference numerals 210 and 220.
  • For the first sensor object 110, i.e. the box provided with the reference symbol 110, three reference symbols 116, 117, 118 are shown in FIG. 15, each of which represents measured values of the first sensor object 110.
  • reference numeral 116 represents both the transverse offset of the object represented by the first sensor object 110, ie its extent in the y direction, and the quality of the determination of the transverse offset of the object represented by the first sensor object 110.
  • the reference symbol 117 represents both the measured width of the object represented by the first sensor object 110 and the quality of the determination of the width of the object represented by the first sensor object 110, while the reference symbol 118 stands for further measured values and attributes supplied by the first sensor 100 with respect to the first sensor object 110.
  • For the third sensor object 210, i.e. the box provided with the reference symbol 210, three reference symbols 216, 217, 218 are shown in FIG. 15, each of which represents measured values of the third sensor object 210.
  • reference numeral 216 represents both the lateral offset of the object represented by the third sensor object 210, i.e. its offset in the y direction, and the quality of the determination of this lateral offset.
  • reference numeral 217 represents both the measured width of the object represented by the third sensor object 210 and the quality of the determination of this width, while the reference symbol 218 stands for further measured values and attributes supplied by the second sensor 200 with respect to the third sensor object 210.
  • For the fourth sensor object 220, i.e. the box provided with the reference symbol 220, three reference symbols 226, 227, 228 are shown in FIG. 15, each of which represents measured values of the fourth sensor object 220.
  • reference numeral 226 represents both the lateral offset of the object represented by the fourth sensor object 220, i.e. its offset in the y direction, and the quality of the determination of this lateral offset.
  • reference numeral 227 represents both the measured width of the object represented by the fourth sensor object 220 and the quality of the determination of this width, while the reference symbol 228 stands for further measured values and attributes supplied by the second sensor 200 with respect to the fourth sensor object 220.
  • an arrow points from the data represented by the reference symbols 116 and 117 to a box provided with the reference symbol 610.
  • an arrow points from the data represented by the reference symbols 216 and 217 to a box provided with the reference symbol 620.
  • an arrow points from the data represented by the reference numerals 226 and 227 to a box provided with the reference numeral 630.
  • the reference numerals 610, 620 and 630 each stand in an identical manner for a first evaluation algorithm, which evaluates the corresponding data of the first, third and fourth sensor objects 110, 210, 220.
  • the first evaluation algorithm, which carries the reference number 600, is shown in detail in FIG. 16.
  • the first input 601 of the "copy" of the first evaluation algorithm 600 designated by the reference number 610 is represented by the arrow between the data 116 and the box 610, and the second input 602 of this "copy" is represented by the arrow between the data 117 and the box 610.
  • 216 corresponds to the first input of 620 and 217 to the second input of 620 and 226 to the first input of 630 and 227 to the second input of 630.
  • a further arrow connects the reference numerals 116, 216 and 226 collectively with a second evaluation algorithm, which is provided with the reference number 640.
  • the outputs of the first evaluation algorithms 610, 620, 630 and the second evaluation algorithm 640 are connected to the input of a so-called coordinator provided with the reference number 660.
  • FIG. 15 further shows the fusion object 445, which represents a fusion object - for example the first fusion object 410 - at a first time step.
  • the fusion object 445 for the first time step comprises three reference symbols 446, 447, 448, each of which represents measured values of the fusion object 445 for the first time step.
  • reference numeral 446 represents both the lateral offset of the object represented by the fusion object 445 for the first time step, i.e. its offset in the y direction, and the quality of the determination of this lateral offset.
  • reference numeral 447 represents both the measured width of the object represented by the fusion object 445 for the first time step and the quality of the determination of this width, while the reference numeral 448 stands for further measured values and attributes relating to the fusion object 445 for the first time step.
  • the fusion object 455 for the second time step comprises three reference numerals 456, 457, 458, which each represent measured values of the fusion object 455 for the second time step.
  • reference numeral 456 represents both the lateral offset of the object represented by the fusion object 455 for the second time step, i.e. its offset in the y direction, and the quality of the determination of this lateral offset.
  • reference numeral 457 represents both the measured width of the object represented by the fusion object 455 for the second time step and the quality of the determination of this width, while the reference numeral 458 stands for further measured values and attributes relating to the fusion object 455 for the second time step.
  • It is assumed in the example shown in FIG. 15 that the sensor objects 110, 210, 220 have been assigned to the same fusion object by the processing unit 400 - for example in an association matrix - and represent one and the same object 10, 20.
  • the aim of the evaluation shown is, for example, to determine the width of this object. Alternatively, a different extension could of course also be determined.
  • the output of the coordinator 660 is connected in FIG. 15 by an arrow to the reference symbol 457, which is intended to express that, through the processing in the first evaluation algorithms 610, 620, 630, the second evaluation algorithm 640 and the coordinator 660, the measured or estimated value of the width of the object represented by the fusion objects 445 and 455 is updated or improved from the first time step (fusion object 445) to the second time step (fusion object 455).
  • the first evaluation algorithm 600, as shown in FIG. 16, is described in more detail below.
  • at its first input 601, the first evaluation algorithm 600 receives a value for the lateral offset and a quality value for the lateral offset.
  • at its second input 602, the first evaluation algorithm 600 receives - at least potentially - a value for the width and a quality value for the width.
  • in a method step provided with the reference symbol 603, a query is made as to whether width information is available at the second input 602. If this is the case, a branch is made to a method step provided with the reference symbol 605, which forwards the width available at the second input 602 and its quality information as output variables of the first evaluation algorithm 600 to the output 607 of the first evaluation algorithm 600.
  • if no width information is available at the second input 602, a branch is made in step 603 to a method step provided with the reference symbol 609, in which an average value of the lateral offset is calculated from the history of the lateral-offset measurement data (available at the first input 601 from the corresponding sensor object or from the central memory of the processing unit 400, in which the tracking data of the individual sensor objects are held).
  • the information about the mean value is then forwarded to a method step provided with the reference symbol 604, in which a decision is made as to whether the lateral offset fluctuates around its mean value.
  • if the lateral offset fluctuates around its mean value, a branch is made to a further method step, provided with the reference symbol 606, in which an estimate for the width of the object in question is generated from the fluctuations of the lateral offset; this estimate is then passed on to the output 607 of the first evaluation algorithm 600. If the lateral offset does not fluctuate around its mean value in step 604, the first evaluation algorithm 600 is either terminated by a further method step shown in FIG. 16 and provided with the reference number 608, or the value "zero" (not shown in FIG. 16) is sent to the output 607 as the estimated or calculated width of the object in question, or the absence of a width is coded in the associated quality value.
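  • The following Python sketch illustrates, purely by way of example, how such a first evaluation algorithm could look; the function name, the thresholds and the quality handling are assumptions made for this illustration and are not part of the described figures.

```python
# Illustrative sketch of the first evaluation algorithm (reference number 600).
# Function name, thresholds and quality handling are assumptions for this example.
from statistics import mean
from typing import Optional, Sequence, Tuple

def evaluate_width_600(lateral_offset_history: Sequence[float],
                       measured_width: Optional[float],
                       width_quality: float,
                       jitter_threshold: float = 0.05) -> Tuple[Optional[float], float]:
    """Return (width estimate, quality value) for one sensor object."""
    # Steps 603/605: if the sensor already delivers width information, forward it.
    if measured_width is not None and width_quality > 0.0:
        return measured_width, width_quality
    # Step 609: mean of the lateral-offset history (tracking data).
    if len(lateral_offset_history) < 2:
        return None, 0.0
    offset_mean = mean(lateral_offset_history)
    # Steps 604/606: if the offset fluctuates around its mean, the spread of the
    # fluctuation is used as a width estimate.
    if max(abs(y - offset_mean) for y in lateral_offset_history) > jitter_threshold:
        width = max(lateral_offset_history) - min(lateral_offset_history)
        quality = min(1.0, len(lateral_offset_history) / 20.0)
        return width, quality
    # Step 608: no width can be derived; the absence is coded in the quality value.
    return None, 0.0
```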
  • the second evaluation algorithm 640 generates, from the information available to it - the lateral-offset values and the associated quality values identified by the reference numerals 116, 216, 226 - an estimated value, by means of a "min-max evaluation", for the total width of the object represented by the sensor objects 110, 210, 220.
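  • A minimal sketch of such a "min-max evaluation" is given below; the names and the quality threshold are assumptions chosen for illustration only.

```python
# Illustrative sketch of the "min-max evaluation" of the second evaluation
# algorithm (reference number 640); names and the quality threshold are assumed.
from typing import Sequence, Tuple

def evaluate_width_640(lateral_offsets: Sequence[float],
                       qualities: Sequence[float],
                       min_quality: float = 0.3) -> Tuple[float, float]:
    """Estimate the total width from the lateral offsets reported by several sensors."""
    usable = [(y, q) for y, q in zip(lateral_offsets, qualities) if q >= min_quality]
    if len(usable) < 2:
        return 0.0, 0.0                              # not enough usable reflection points
    offsets = [y for y, _ in usable]
    width = max(offsets) - min(offsets)              # min-max spread across the sensors
    quality = min(q for _, q in usable)              # conservative quality of the estimate
    return width, quality
```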
  • the coordinator 660 generates, from the width information and the associated quality information supplied by the first and second evaluation algorithms 610, 620, 630, 640, a piece of width information which, as shown in FIG. 15, is fed to the fusion object 455 at the second time step.
  • the method for determining the object size of objects tries to provide an estimate for the extent by searching the sensor objects arriving from the sensors or the object lists for direct and indirect information about the extent of detected objects.
  • One advantage of determining the size at the level of the fusion objects is the redundancy of the incoming data, which is often present. From multiple measurements of the same object over several measuring cycles, size information can be obtained. Another advantage is the ability to process inhomogeneous size information.
  • the combination of different methods makes it possible to merge both measured object extents and sizes obtained from position measurements, ie to process them on the level of fusion objects.
  • To determine the size of the object, several object attributes are taken into account, for example the longitudinal distance, the lateral offset and, if available, the object width, object height and object length.
  • Object size build-up from a time-based reflex observation: the reflex migration (the wandering of the reflection point) of a radar sensor can be exploited. This requires stable detection of an object over a longer period of time.
  • Object plausibility is used to assess the stability and duration of the detection. Above a defined threshold of the object plausibility, an object is considered sufficiently plausible to provide information about its width and length. At least the standard deviations of the measurement errors are subtracted from the measured minimum and maximum values of the distance and the lateral offset in order to generate a statistically secured extent.
  • Suitable reflex migrations occur, among other things, in wide curves or during entry and exit procedures, for example on motorways. In order to prevent two separate real objects from being incorrectly used to generate the width, jumps in the data are not permitted. This sub-method is therefore only applied to sufficiently smooth measurement data.
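  • By way of illustration only, a minimal sketch of this time-based size build-up is shown below; the thresholds and the plausibility handling are assumptions.

```python
# Illustrative sketch of the object-size build-up from a time-based reflex
# observation; the thresholds and the plausibility handling are assumptions.
from typing import Optional, Sequence

def width_from_reflex_migration(lateral_offsets: Sequence[float],
                                sigma_y: float,
                                plausibility: float,
                                plausibility_threshold: float = 0.3,
                                max_jump: float = 0.5) -> Optional[float]:
    """Derive a statistically secured width from the tracked lateral offset."""
    if plausibility < plausibility_threshold or len(lateral_offsets) < 2:
        return None
    # Jumps in the data are not permitted: two separate real objects must not
    # be merged into one width, so only sufficiently smooth data is used.
    for previous, current in zip(lateral_offsets, lateral_offsets[1:]):
        if abs(current - previous) > max_jump:
            return None
    # Subtract the standard deviations of the measurement error from the
    # observed min-max spread to obtain a statistically secured extent.
    spread = max(lateral_offsets) - min(lateral_offsets)
    return max(0.0, spread - 2.0 * sigma_y)
```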
  • Object size build-up from a spatial reflex observation: the point-like reflex centers of several individual sensors can be processed. If a single real object is detected by several sensors at different reflection points, an extent can be built up after just a few measuring cycles. The object width and object length are determined from the minimum and maximum values. The sensor measurements are assigned to one and the same real object 10, 20 in the association step in the processing unit 400.
  • Object size build-up from a spatial fusion of sizes: if several individual sensors are used that provide width and/or height and/or length information, extended object widths, heights and lengths can be determined during the processing step of merging, i.e. when the individual objects are merged. The individual dimensions are combined. The resulting sizes are therefore always larger than the individual dimensions.
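  • A short sketch of this interval-like combination of measured extents follows; the names are assumptions for illustration.

```python
# Illustrative sketch of the spatial fusion of sizes; names are assumptions.
from typing import Sequence, Tuple

def fuse_widths(measurements: Sequence[Tuple[float, float]]) -> float:
    """measurements: (lateral offset of the object centre, measured width) per sensor.

    The lateral extents are merged as intervals, so the fused width is never
    smaller than any individual width.
    """
    if not measurements:
        return 0.0
    left = min(y - w / 2.0 for y, w in measurements)
    right = max(y + w / 2.0 for y, w in measurements)
    return right - left
```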
  • Object size build-up using combined methods: if a point measurement (e.g. using radar sensors) is assigned to an existing, tracked fusion object with size information (e.g. using video sensors), the size can be expanded if there is sufficient plausibility, for example by plausible measurements at the edge of and outside the previous object boundary.
  • One of the aspects of the present invention relates to strategies for weighting object data when creating or updating fusion objects.
  • the weighting of individual sensor data within the fusion objects is to be carried out intelligently according to the invention.
  • the data supplied by various sensors and assigned to the same fusion object by an association step (which in their entirety represent a real object 10, 20) are merged or associated to form a single data record of the fusion object.
  • the aim is to achieve the greatest possible accuracy in the data of the fusion objects. Due to different physical measurement principles, the individual sensors have different detection properties. In addition, there may also be sample variances.
  • the method according to the invention additionally uses information about the quality of the supplied data.
  • the main advantage of intelligent weighting of different sensor data is the best possible use of the information about the accuracy of the individual data.
  • the input data can consist of sensor measurement data or of existing and tracked fusion objects or a mixture of both. Another advantage of the method according to the invention is the possibility of fusing fusion objects and individual sensor objects in any desired composition.
  • a quality measure exists for each object datum. This quality measure can be fixed statically or can be supplied dynamically with each measurement.
  • One of the main steps in processing sensor data is the data association, in which the available data are assigned to the existing fusion objects. The weighting is done using the quality measures of the individual sensors: the worse the quality of a datum, the less weight it receives. If a number n of data a_1, ..., a_n is present before the fusion, the fused value a is calculated as a = w_1·a_1 + w_2·a_2 + ... + w_n·a_n.
  • w_i is greater than or equal to zero for all i, and the sum over all w_i is equal to one.
  • All attributes with continuous values, such as distance, speed, acceleration, width, height and length, are object data and can be merged in this way.
  • Attributes with discrete values are, for example, the number of tracking cycles and the plausibility.
  • Measurement errors of measurands typically have statistical distributions. The variances of such distributions can be used as precision measures, for example. Alternatively, other scalar quality measures are also possible. With regard to a single object attribute, however, a uniform definition of the quality measure must be used.
  • the weights w_i are used in a normalized form according to the reciprocal values of the individual quality measures.
  • For two individual data a_1 and a_2 with the variances σ_1·σ_1 and σ_2·σ_2, the weights are: w_1 = (1/σ_1²) / (1/σ_1² + 1/σ_2²) = σ_2² / (σ_1² + σ_2²) and w_2 = (1/σ_2²) / (1/σ_1² + 1/σ_2²) = σ_1² / (σ_1² + σ_2²).
  • the weights w_i of the individual sensors are chosen so that the variance of the fused value is minimal. For this, a multi-dimensional quadratic optimization problem is solved.
  • the weights depend on the number and values of the individual variances. They can be specified in a closed formula. In the case of two individual data a_1 and a_2 with the variances σ_1·σ_1 and σ_2·σ_2, the weights are: w_1 = σ_2² / (σ_1² + σ_2²) and w_2 = σ_1² / (σ_1² + σ_2²).
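  • A minimal sketch of this quality-weighted fusion of a continuous attribute is given below; the function and variable names are assumptions made for this example.

```python
# Illustrative sketch of the quality-weighted fusion of a continuous attribute;
# the weights are the normalized reciprocals of the individual variances, which
# also minimizes the variance of the fused value. Names are assumptions.
from typing import Sequence, Tuple

def fuse_attribute(values: Sequence[float],
                   variances: Sequence[float]) -> Tuple[float, float]:
    """Return (fused value, variance of the fused value)."""
    reciprocals = [1.0 / v for v in variances]     # reciprocal quality measures
    total = sum(reciprocals)
    weights = [r / total for r in reciprocals]     # w_i >= 0 and sum(w_i) == 1
    fused = sum(w * a for w, a in zip(weights, values))
    fused_variance = 1.0 / total                   # never larger than any input variance
    return fused, fused_variance

# Example: distances 10.2 m (variance 0.04) and 10.6 m (variance 0.16)
# fuse to 10.28 m with a variance of 0.032.
```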
  • the plausibility describes how safely and reliably an object is detected. It accompanies a fusion object throughout its life.
  • object plausibility plays an important role.
  • the plausibility can be passed on to vehicle guidance systems or driver assistance systems as an essential attribute of an object, in particular of a fusion object. This increases the level of detail of a driving environment detected by sensors. The classification and interpretation of a detected driving environment benefit in particular from the quality of the plausibility.
  • the plausibility is created as an attribute of an object, that is to say in particular a fusion object.
  • the plausibility contains the object history and specifies how reliably an object is detected.
  • the incrementing and decrementing of the plausibility depends on various influencing factors.
  • the plausibility can be defined as a scalar measure. If a new fusion object is created, the plausibility is set to zero or to a value that corresponds to the number of detecting sensors. If the fusion object continues to be detected, the plausibility is continuously increased. An object is considered plausible above a threshold. If measurement misfires occur, the plausibility is reduced accordingly. If the object is not detected over several cycles because it no longer exists or has moved out of the detection range of all the individual sensors used, the plausibility is successively reduced. If it falls below a defined threshold, the fusion object is no longer considered plausible. If the plausibility is sufficiently low, a fusion object is deleted.
  • the plausibility can be normalized to the interval [0, 1]. In this case, a compact range of values is defined. Appropriate discretization allows the required storage space to be determined independently of time. If a change in a plausibility value is calculated which would result in leaving the interval [0, 1], the changed value is limited by a limiter to zero at the lower end and to one at the upper end. If an increment or decrement results in arithmetical values between the discretization levels, the result can be rounded to the nearest discretization value.
  • the basic increment denotes the smallest unit of a plausibility change. This increment can be constant or variable. Different variants are possible:
  • a constant value, around 0.1, can be used as the basic increment.
  • An exponential value can be selected as the basic increment. If the plausibility lies in the interval [0, 1], only small changes then take place near 0 and 1; the changes are greatest at 0.5. The values 0 and 1 are only reached asymptotically.
  • the increments and decrements of the plausibility measure are determined depending on the number and quality of the objects of the individual sensors.
  • the increment can, for example, be proportional to the number of individual objects.
  • the plausibility is recalculated in time with the sensor data fusion.
  • the cycles of the sensor data fusion can be of the same or of a different cycle length.
  • the plausibility of the individual sensors can be weighted equally.
  • the sensors are treated equally with regard to the age of the data supplied by the sensors and the characteristic sensor cycle time. Decrementing takes place if, within one cycle of the sensor data fusion, i.e. within the time period in which the processing algorithm is repeated, an object could have been measured on the basis of the sensor cycle time but was actually not measured; otherwise no decrementing takes place.
  • a fusion object is considered plausible for applications if its plausibility measure is above a specified threshold.
  • a hysteresis can be built into this threshold. If the plausibility lies in the interval [0, 1], e.g. 0.3 can be such a threshold. With hysteresis, this value can be set to 0.4 with increasing plausibility and to 0.2 with decreasing plausibility.
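  • The following sketch illustrates, under assumed parameter values, how such a plausibility update with limiting to [0, 1] and a hysteresis threshold could be realized; it is an example only, not the described implementation.

```python
# Illustrative sketch of the plausibility management: increment on detection,
# decrement on an expected but missing detection, limit to [0, 1] and apply a
# hysteresis around the "plausible" threshold. All constants are assumptions.
class Plausibility:
    def __init__(self, base_increment: float = 0.1):
        self.value = 0.0
        self.plausible = False
        self.base_increment = base_increment

    def update(self, detecting_sensors: int, expected_sensors: int) -> None:
        if detecting_sensors > 0:
            # Increment, here proportional to the number of detecting sensors.
            self.value += self.base_increment * detecting_sensors
        elif expected_sensors > 0:
            # The object could have been measured in this fusion cycle but was not.
            self.value -= self.base_increment
        # Limiter: keep the plausibility inside the interval [0, 1].
        self.value = min(1.0, max(0.0, self.value))
        # Hysteresis: rising threshold 0.4, falling threshold 0.2 (example values).
        if not self.plausible and self.value >= 0.4:
            self.plausible = True
        elif self.plausible and self.value <= 0.2:
            self.plausible = False
```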
  • FIG. 17 shows a scheme for plausibility management.
  • FIG. 17 shows a fusion object 445 at a first time step. Such a fusion object comprises, as an attribute, a plausibility measure, which is identified in FIG. 17 for the fusion object 445 for the first time step by the reference symbol 449.
  • sensor objects that are supplied by sensors 100, 200, 300 were assigned to fusion object 445 in the first time step.
  • the plausibility measure for the fusion object 445 for the first time step is now to be updated for the second time step.
  • the updated object is represented by the fusion object 455 for the second time step, also shown in FIG. 17, which likewise comprises a plausibility measure, identified by the reference symbol 459.
  • for this purpose, a method step denoted by the reference numeral 672 is carried out, to which an arrow points in each case from the fusion object 445 at the first time step and from the plausibility measure 449, and which calculates the base increment on the basis of the plausibility measure 449. Then, starting from the method step identified by the reference number 672, a further method step designated by the reference number 673 is carried out; accordingly, an arrow points from the reference number 672 to the reference number 673.
  • in the method step identified by the reference number 673, the plausibility is incremented or decremented, specifically as a function of information which is available, directly or indirectly, from the sensors 100, 200, 300 also shown in FIG. 17.
  • the value of the plausibility, checked in the method step provided with the reference symbol 674, is then made available to the fusion object 455 in the second time step; this is indicated in FIG. 17 by arrows pointing from the reference symbol 674 to the fusion object 455 for the second time step and to its plausibility attribute 459.
  • the sensors 100, 200, 300 each deliver sensor objects which are associated with the fusion objects 445, 455 for the first and for the second time step.
  • the sensors 100, 200, 300 provide information which is processed in a method step provided with the reference number 670, which is carried out separately for each sensor but in a uniform manner; this is shown in FIG. 17 by an arrow from each of the sensors 100, 200, 300 to one of the reference numerals 670, which are present in a corresponding plurality.
  • the processing in the method step denoted by reference number 670 includes in particular the determination of the sensor quality and the determination of the data age.
  • the information supplied by the sensors 100, 200, 300 and processed in the method steps provided with the reference number 670 is made available to a method step identified by the reference number 671; this is shown in FIG. 17 by an arrow pointing from the reference number 670 to the reference number 671.
  • in the method step designated by the reference number 671, the sensor data originating from the individual sensors 100, 200, 300 are weighted, and the result is made available to the method step identified by the reference number 673, as has already been described above and as is shown in FIG. 17 by an arrow from the reference number 671 to the reference number 673.
  • One of the aspects of the present invention relates to a method for prioritizing fusion objects.
  • the task is to carry out an optimized selection of relevant fusion objects.
  • a fixed and limited memory area is usually provided for the fusion object list.
  • the forwarding of the information from the fusion objects via the bus system B means additional effort, which must also be within the limits of the available resources. If several individual sensors are used and all objects of the individual sensors are taken into account, the list of fusion objects can be significantly longer than the lists of individual sensors. If the number of fusion objects is limited, a selection must be made.
  • Such prioritization of fusion objects is application-specific. It is particularly advantageous in the method according to the invention that a method or a device for prioritizing fusion objects is used in which there is a concentration on the sensor information relevant to a defined application. If only relevant information is processed and forwarded, this saves resources and increases the speed of information processing. This is advantageous in highly dynamic or dangerous situations (e.g. when an automatic emergency braking function is active), in which the cycle rate of the sensor data fusion can then be increased. In such situations, a reduced data fusion can take place using only the data of the essential, i.e. prioritized, objects. A selection is made from all potential fusion objects. The selection is made in such a way that the fusion object list always contains the most relevant objects.
  • the prioritization is based on an internal ranking, i.e. the creation of a list of priorities.
  • the ranking is based on a priority measure.
  • Fusion objects are sorted and managed according to their relevance. The highest priority merged objects are the most relevant and remain in the list of merged objects. If new fusion objects arise, their priority measures are compared with those of the already existing fusion objects. If there are more potential fusion objects than list positions, the least relevant objects are removed from the fusion object list or not added at all.
  • a scalar measure can be used to describe the priority.
  • a fine discretization of the priority measure should be used if possible. Standardization, for example to the interval [0, 1], is possible. However, it must then be ensured that saturation of the priority at the value 1 is practically never reached; otherwise, no clear ranking can be drawn up.
  • Application-dependent selection of influences on the priority: function-dependent information is used to calculate the priority measure. This information is provided by the application. Possible data are the lane and the driving tube of one's own vehicle, or the distance and the relative speed to a vehicle in front in one's own or an adjacent lane. Depending on the application, the vehicle's own speed can also influence the relevance of objects. Further information relevant to prioritization is, for example, the steering angle, the yaw rate or the wheel speed.
  • the priority is recalculated in time with the sensor data fusion. Information from the application is evaluated to determine the priority. It is possible to use fixed priority constants for events. For a priority measure standardized to the interval [0, 1], such constants can be in the percent or per-mille range. For each event that occurs, the associated constant is added to the priority measure. The level of the constant defines its importance. Multiple events can be superimposed. Events of an object are, for example, the presence of the object in one's own lane, a high deceleration value of a vehicle cutting in, or the use of the object as a target object by a controller (e.g. an ACC controller).
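  • A minimal sketch of such an event-based prioritization with a bounded fusion object list follows; the event constants, names and list size are assumptions chosen for this example.

```python
# Illustrative sketch of the prioritization of fusion objects; the event
# constants and the list size are assumptions chosen for this example.
from dataclasses import dataclass, field
from typing import List

EVENT_CONSTANTS = {
    "in_own_lane": 0.10,                        # object is in one's own lane
    "cutting_in_with_high_deceleration": 0.05,  # strongly decelerating cut-in vehicle
    "acc_target_object": 0.20,                  # object is the ACC target object
}

@dataclass
class FusionObject:
    object_id: int
    priority: float = 0.0
    events: List[str] = field(default_factory=list)

def update_priorities(objects: List[FusionObject], max_list_size: int) -> List[FusionObject]:
    """Add event constants to the priority measure and keep only the most relevant objects."""
    for obj in objects:
        for event in obj.events:
            obj.priority += EVENT_CONSTANTS.get(event, 0.0)
        obj.priority = min(obj.priority, 0.999)    # avoid saturation at the value 1
    return sorted(objects, key=lambda o: o.priority, reverse=True)[:max_list_size]
```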

Abstract

The invention relates to a method and a device for the exchange and joint processing of object data between sensors (100, 200, 300) and a processing unit (400). According to the invention, position information and/or speed information and/or further attributes (size, identification, markers) of sensor objects (110, 120, 210, 220) and of fusion objects (410, 420, 430) are transmitted and processed.
EP02735078A 2001-07-17 2002-06-07 Procede et dispositif d'echange et de traitement de donnees Withdrawn EP1412773A1 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE10133945A DE10133945A1 (de) 2001-07-17 2001-07-17 Verfahren und Vorrichtung zum Austausch und zur Verarbeitung von Daten
DE10133945 2001-07-17
PCT/DE2002/001930 WO2003008995A1 (fr) 2001-07-17 2002-06-07 Procede et dispositif d'echange et de traitement de donnees

Publications (1)

Publication Number Publication Date
EP1412773A1 true EP1412773A1 (fr) 2004-04-28

Family

ID=7691567

Family Applications (1)

Application Number Title Priority Date Filing Date
EP02735078A Withdrawn EP1412773A1 (fr) 2001-07-17 2002-06-07 Procede et dispositif d'echange et de traitement de donnees

Country Status (4)

Country Link
US (1) US7340380B2 (fr)
EP (1) EP1412773A1 (fr)
DE (1) DE10133945A1 (fr)
WO (1) WO2003008995A1 (fr)

Families Citing this family (73)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE519803C2 (sv) * 2001-08-06 2003-04-08 Ericsson Telefon Ab L M Metod och anordning för analys av sensorsystems prestanda
JP3941765B2 (ja) 2003-09-11 2007-07-04 トヨタ自動車株式会社 物体検出装置
JP3918791B2 (ja) 2003-09-11 2007-05-23 トヨタ自動車株式会社 物体検出装置
CN1981251A (zh) * 2004-06-22 2007-06-13 拉比特合资有限公司 信号处理方法和装置
US20060178857A1 (en) * 2005-02-10 2006-08-10 Barajas Leandro G Quasi-redundant smart sensing topology
JP4557819B2 (ja) * 2005-06-21 2010-10-06 アルパイン株式会社 車両周辺情報提供装置
JP4813141B2 (ja) * 2005-10-05 2011-11-09 川崎重工業株式会社 情報提供装置
JP4892559B2 (ja) * 2005-10-14 2012-03-07 コンチネンタル オートモーティヴ システムズ ユーエス,インコーポレイテッド 車両衝突データ発生方法及び車両衝突検知システム
JP4595833B2 (ja) 2006-02-24 2010-12-08 トヨタ自動車株式会社 物体検出装置
US7610127B2 (en) * 2006-03-08 2009-10-27 Delphi Technologies, Inc. Vehicle stability monitoring system and method and article of manufacture for determining vehicle stability
JP2007241726A (ja) * 2006-03-09 2007-09-20 Denso Corp 運転支援システム、送信装置及び受信装置
DE102006052779A1 (de) * 2006-11-09 2008-05-15 Bayerische Motoren Werke Ag Verfahren zur Erzeugung eines Gesamtbilds der Umgebung eines Kraftfahrzeugs
DE102007002197A1 (de) * 2007-01-16 2008-07-17 Siemens Ag Gemeinsamer Kontroller für verschiedene Fahrerassistenzsysteme
DE102007018470A1 (de) * 2007-04-19 2008-10-23 Robert Bosch Gmbh Fahrerassistenzsystem und Verfahren zur Objektplausibilisierung
US7653513B2 (en) * 2007-06-20 2010-01-26 At&T Intellectual Property Ii, L.P. Sensor registration by global optimization procedures
DE102007042481B4 (de) * 2007-09-06 2022-04-07 Zf Cv Systems Hannover Gmbh Fahrzeugsteuersystem für einen Kraftwagen
FR2929411B1 (fr) * 2008-03-28 2010-06-11 Thales Sa Procede et systeme de pistage et de suivi d'emetteurs.
US20090254260A1 (en) * 2008-04-07 2009-10-08 Axel Nix Full speed range adaptive cruise control system
FR2933221B1 (fr) * 2008-06-26 2013-04-05 Renault Sas Procede de fonctionnement d'un systeme de detection d'obstacles destine a etre embarque sur un vehicule automobile.
US20100134285A1 (en) * 2008-12-02 2010-06-03 Honeywell International Inc. Method of sensor data fusion for physical security systems
US8165769B2 (en) * 2009-03-02 2012-04-24 GM Global Technology Operations LLC Multi-factor speed estimation system and method for use
US20100318257A1 (en) * 2009-06-15 2010-12-16 Deep Kalinadhabhotla Method and system for automatically calibrating a three-axis accelerometer device
DE102010063984A1 (de) * 2010-02-11 2011-08-11 Continental Teves AG & Co. OHG, 60488 Fahrzeug-Sensor-Knoten
US8412406B2 (en) * 2010-08-13 2013-04-02 Deere & Company Method and system for performing diagnostics or software maintenance for a vehicle
US8717422B2 (en) * 2010-12-22 2014-05-06 Texas Instruments Incorporated Multi-sensor video frame synchronization apparatus and methods
EP2604478B2 (fr) 2011-12-13 2021-03-31 Aptiv Technologies Limited Procédé de détection d'erreurs de fonctionnement d'un dispositif multicapteurs
DE102012103669A1 (de) 2012-04-26 2013-10-31 Continental Teves Ag & Co. Ohg Verfahren zur Darstellung einer Fahrzeugumgebung
DE102012211391A1 (de) * 2012-07-02 2014-01-02 Continental Teves Ag & Co. Ohg Verfahren und System zur Informationsnutzung
DE102012106932A1 (de) 2012-07-30 2014-05-15 Continental Teves Ag & Co. Ohg Verfahren zur Darstellung einer Fahrzeugumgebung mit Positionspunkten
US9405006B2 (en) * 2012-09-03 2016-08-02 Toyota Jidosha Kabushiki Kaisha Collision determination device and collision determination method
US9218538B2 (en) 2013-01-30 2015-12-22 Xerox Corporation Methods and systems for detecting an object borderline
US9571562B2 (en) * 2013-03-15 2017-02-14 Dana Limited System and method for data collection and analysis using a multi-level network
US9350800B2 (en) * 2013-06-05 2016-05-24 Microsoft Technology Licensing, Llc Defragmenting clusters with reserved resources
CH708274A1 (de) * 2013-07-04 2015-01-15 Schweizerische Eidgenossenschaft Eidgenössisches Dept Für Verteidigung Bevölkerungsschutz Und Sport Verfahren zur Bestimmung von Trajektorien beweglicher physischer Objekte in einem Raum, auf der Basis von Sensordaten mehrerer Sensoren.
EP2865576B1 (fr) * 2013-10-22 2018-07-04 Honda Research Institute Europe GmbH Estimation de confiance composite pour systèmes d'assistance de conducteur prédictifs
EP2865575B1 (fr) 2013-10-22 2022-08-24 Honda Research Institute Europe GmbH Estimation de confiance pour systèmes d'assistance de conducteur prédictifs sur la base de règles de plausibilité
KR101480651B1 (ko) * 2013-12-09 2015-01-09 현대자동차주식회사 오브젝트 처리 방법 및 이를 지원하는 차량
DE102014205180A1 (de) * 2014-03-20 2015-09-24 Robert Bosch Gmbh Verfahren und Vorrichtung zum Betreiben eines Fahrzeugs
EP3786949B1 (fr) * 2014-05-01 2022-02-16 Nippon Telegraph And Telephone Corporation Codage d'un signal sonore
CN104104397B (zh) * 2014-06-20 2017-01-18 杭州电子科技大学 一种多频段毫米波通信发射机
JP6484000B2 (ja) * 2014-10-22 2019-03-13 株式会社デンソー 物体検知装置
CN105806320B (zh) * 2014-12-29 2020-04-21 同方威视技术股份有限公司 拍摄测量系统以及拍摄测量方法
DE102015001757B4 (de) 2015-02-11 2018-01-04 Audi Ag Verfahren und System zum Betreiben mehrerer Kraftfahrzeuge
KR101714145B1 (ko) * 2015-04-09 2017-03-08 현대자동차주식회사 주변차량 식별 장치 및 그 방법
US10328949B2 (en) * 2016-01-28 2019-06-25 Toyota Motor Engineering & Manufacturing North America, Inc. Sensor blind spot indication for vehicles
EP3232285B1 (fr) * 2016-04-14 2019-12-18 Volvo Car Corporation Procédé et agencement destinés à surveiller et à adapter la performance d'un système de fusion d'un véhicule autonome
SE1650719A1 (en) * 2016-05-25 2017-11-26 Scania Cv Ab Method for decentralized sensor fusion in a vehicle and sensor fusion system
FR3054684B1 (fr) 2016-07-29 2018-08-24 Institut Vedecom Systeme de pilotage d’un vehicule autonome
DE102016220075A1 (de) * 2016-10-14 2018-04-19 Audi Ag Kraftfahrzeug und Verfahren zur 360°-Umfelderfassung
US10073456B2 (en) * 2016-11-17 2018-09-11 GM Global Technology Operations LLC Automated co-pilot control for autonomous vehicles
DE102017210975A1 (de) 2017-06-28 2019-01-17 Audi Ag Verfahren zur Datenerhebung
US10850732B2 (en) * 2017-09-05 2020-12-01 Aptiv Technologies Limited Automated speed control system
US10971017B2 (en) 2017-10-31 2021-04-06 Cummins Inc. Sensor fusion and information sharing using inter-vehicle communication
DE102018216809A1 (de) * 2018-09-28 2020-04-02 Robert Bosch Gmbh Verfahren, Vorrichtung und Sensorsystem zur Umfelderfassung für ein Fahrzeug
CN110376583B (zh) * 2018-09-30 2021-11-19 毫末智行科技有限公司 用于车辆传感器的数据融合方法及装置
CN110378178B (zh) * 2018-09-30 2022-01-28 毫末智行科技有限公司 目标跟踪方法及装置
US10839263B2 (en) 2018-10-10 2020-11-17 Harman International Industries, Incorporated System and method for evaluating a trained vehicle data set familiarity of a driver assitance system
KR20200040404A (ko) * 2018-10-10 2020-04-20 주식회사 만도 차량용 레이더 장치 및 그 제어 방법
KR102569900B1 (ko) * 2018-12-04 2023-08-23 현대자동차주식회사 전방위 센서퓨전 장치 및 그의 센서퓨전 방법과 그를 포함하는 차량
DE102018133457B4 (de) * 2018-12-21 2020-07-09 Volkswagen Aktiengesellschaft Verfahren und System zum Bereitstellen von Umgebungsdaten
DE102019102769A1 (de) * 2019-02-05 2020-08-06 Bayerische Motoren Werke Aktiengesellschaft Verfahren und eine Vorrichtung zur Sensordatenfusion für ein Fahrzeug
DE102019102920A1 (de) * 2019-02-06 2020-08-06 Bayerische Motoren Werke Aktiengesellschaft Verfahren und eine Vorrichtung zur Sensordatenfusion für ein Fahrzeug
DE102019202949A1 (de) * 2019-03-05 2020-09-10 Zf Friedrichshafen Ag Verteilte Verarbeitung von Radarsignalen
DE102020206659A1 (de) * 2019-05-30 2020-12-03 Robert Bosch Gesellschaft mit beschränkter Haftung Multi-hypothesen-objektverfologung für automatisierte fahrsysteme
US11178363B1 (en) * 2019-06-27 2021-11-16 Objectvideo Labs, Llc Distributed media monitoring
KR20210030524A (ko) * 2019-09-09 2021-03-18 주식회사 만도 차량 제어 장치 및 그 제어 방법
CN110596653A (zh) * 2019-09-24 2019-12-20 江苏集萃智能传感技术研究所有限公司 一种多雷达数据融合方法及装置
DE102019216517B3 (de) * 2019-10-25 2021-03-18 Daimler Ag Verfahren zur Synchronisation zumindest zweier Sensor-Systeme
DE102020107790A1 (de) 2020-03-20 2021-09-23 Audi Aktiengesellschaft Verfahren zur Erfassung von Objekten in einer Umgebung eines Kraftfahrzeugs und Kraftfahrzeug
DE102021205993A1 (de) 2021-06-14 2022-12-15 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zum Betreiben eines Scheinwerfersystems eines Kraftfahrzeugs
DE102022117706A1 (de) 2022-07-15 2024-01-18 Bayerische Motoren Werke Aktiengesellschaft Verfahren und Assistenzsystem zur radarbasierten Größeneinstufung von Objekten und entsprechend eingerichtetes Kraftfahrzeug
DE102022123303A1 (de) * 2022-09-13 2024-03-14 Bayerische Motoren Werke Aktiengesellschaft Verfahren und Objekttrackingeinrichtung für eine sensorbasierte Objektnachverfolgung und entsprechend eingerichtetes Kraftfahrzeug
CN115662168A (zh) * 2022-10-18 2023-01-31 浙江吉利控股集团有限公司 一种环境感知方法、装置及电子设备

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3546664C3 (de) * 1985-02-22 1995-10-26 Bosch Gmbh Robert Verfahren zum Betreiben einer Datenverarbeitungsanlage
US4860216A (en) * 1986-11-13 1989-08-22 The United States Of America As Represented By The Secretary Of The Air Force Communication adaptive multi-sensor system
US5005147A (en) * 1988-12-30 1991-04-02 The United States Of America As Represented By The Administrator, The National Aeronautics And Space Administration Method and apparatus for sensor fusion
US5202661A (en) * 1991-04-18 1993-04-13 The United States Of America As Represented By The Secretary Of The Navy Method and system for fusing data from fixed and mobile security sensors
US5317319A (en) * 1992-07-17 1994-05-31 Hughes Aircraft Company Automatic global radar/IR/ESM track association based on ranked candidate pairings and measures of their proximity
US5661666A (en) * 1992-11-06 1997-08-26 The United States Of America As Represented By The Secretary Of The Navy Constant false probability data fusion system
FR2709834B1 (fr) * 1993-09-10 1995-11-10 Framatome Sa Procédé et dispositif pour la détection et la localisation d'obstacles dans l'environnement d'un véhicule.
US5808916A (en) * 1994-08-04 1998-09-15 City Of Scottsdale Method for monitoring the environment
JP3468001B2 (ja) * 1996-12-16 2003-11-17 日産自動車株式会社 車両用走行制御装置
DE19734639A1 (de) * 1997-08-11 1999-02-18 Spherics Mess Und Analysetechn Recheneinrichtung zum Verarbeiten einer Eingangsdatenstruktur
US6157894A (en) * 1997-12-23 2000-12-05 Simmonds Precision Products, Inc. Liquid gauging using sensor fusion and data fusion
DE29811174U1 (de) * 1998-06-23 1998-11-26 Brecht Thomas Fahrtgeschwindigkeitsregelsystem mit integrierter, dynamischer Überwachung des Sicherheitsabstands
US6002358A (en) * 1998-08-18 1999-12-14 Northrop Grumman Corporation Method and apparatus for determining whether two radar tracks represent the same physical object
US7015789B1 (en) * 1999-05-13 2006-03-21 Honeywell International Inc. State validation using bi-directional wireless link
JP2003501635A (ja) * 1999-05-26 2003-01-14 ローベルト ボッシュ ゲゼルシャフト ミット ベシュレンクテル ハフツング 対象検出システム
US6547692B1 (en) * 1999-06-12 2003-04-15 Robert Bosch Gmbh System for adjusting the tension of the continuous belt component of a CVT
DE19928915A1 (de) * 1999-06-24 2001-01-11 Bosch Gmbh Robert Verfahren zur Sichtweitenbestimmung
WO2001001366A2 (fr) * 1999-06-25 2001-01-04 Telemonitor, Inc. Procede et systeme de surveillance intelligent a distance
DE19945268A1 (de) * 1999-09-21 2001-03-22 Bosch Gmbh Robert Verfahren und Vorrichtung zur Zustandserkennung bei einem System zur automatischen Längs- und/oder Querregelung bei einem Kraftfahrzeug
DE19945250B4 (de) * 1999-09-21 2011-06-09 Bayerische Motoren Werke Aktiengesellschaft Verfahren und Vorrichtung zur Zustandserkennung bei einem System zur automatischen Längs- und/oder Querregelung bei einem Kraftfahrzeug
DE19950915B4 (de) * 1999-10-21 2004-06-17 Forschungszentrum Jülich GmbH Verfahren zur Bestimmung eines Ortes, an dem ein Detetkionssignal am wahrscheinlichsten erfolgt ist und Auswerteeinheit für ein Detektorsystem
DE10015164A1 (de) * 2000-03-27 2001-10-11 Helmut Klausing Kommunikations-Verfahren mit einem ROSAR-Gerät
WO2002033443A2 (fr) * 2000-06-14 2002-04-25 Vermeer Manufacturing Company Systeme et procede de cartographie d'equipements et de distribution de donnees
US6629033B2 (en) * 2001-04-24 2003-09-30 Medius, Inc. Open communication system for real-time multiprocessor applications
US7283904B2 (en) * 2001-10-17 2007-10-16 Airbiquity, Inc. Multi-sensor fusion

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
DANA M P: "REGISTRATION: A PREREQUISITE FOR MULTIPLE SENSOR TRACKING", MULTITARGET-MULTISENSOR TRACKING: ADVANCED APPLICATIONS, XX, XX, 1 January 1990 (1990-01-01), pages 155 - 185, XP009015150 *
HALL L D ET AL: "AN INTRODUCTION TO MULTISENSOR DATA FUSION", PROCEEDINGS OF THE IEEE, IEEE. NEW YORK, US, vol. 85, no. 1, 1 January 1997 (1997-01-01), pages 6 - 23, XP000686448, ISSN: 0018-9219, DOI: DOI:10.1109/5.554205 *
See also references of WO03008995A1 *

Also Published As

Publication number Publication date
WO2003008995A1 (fr) 2003-01-30
US20050021201A1 (en) 2005-01-27
DE10133945A1 (de) 2003-02-06
US7340380B2 (en) 2008-03-04

Similar Documents

Publication Publication Date Title
WO2003008995A1 (fr) Procede et dispositif d'echange et de traitement de donnees
EP1135274B1 (fr) Procede et dispositif permettant de determiner la future trajectoire d'un vehicule
DE102019120118A1 (de) Vorrichtung und verfahren zum steuern des fahrens eines fahrzeugs
DE112012006226B4 (de) Fahrassistenzvorrichtung
WO2005037592A1 (fr) Procede et systeme d'identification de procedures de changement de voie pour un vehicule
DE102010020047A1 (de) Verfahren zur Anpassung eines für ein Abstandsregelsystem eines Fahrzeugs vorgegebenen Soll-Abstandes an eine momentane Verkehrssituation
EP1577682A1 (fr) Système de localisation d'objects pour véhicule automobile pour identifier de procedures de changement de voie
DE102019002790B4 (de) Verfahren zur Prädiktion einer Verkehrssituation für ein Fahrzeug
EP1912844B1 (fr) Procede pour produire des hypotheses relatives a l'environnement exterieur pour des fonctions d'assistance au conducteur
EP1808350A1 (fr) Procédure pour le contrôle d'un système de guidage longitudinal pour automobile
EP1690730A1 (fr) Système d'assistance au conducteur avec unité de décision rdondante
WO2018019454A1 (fr) Procédé et dispositif permettant de déterminer un modèle de chaussée pour un environnement de véhicule
EP3024709B1 (fr) Fourniture efficace d'informations d'occupation pour l'environnement d'un véhicule
DE102020215780A1 (de) Verfahren zur Auswahl eines automatisierten Fahrvorgangs mittels eines Fahrassistenzsystems
EP2964503B1 (fr) Estimation de la vitesse future et/ou distance d'un véhicule d'un point de référence et estimation de l'accélération future
DE102008063033A1 (de) Vorrichtung und Verfahren zur Erkennung von Kollisionen mit erhöhter funktionaler Sicherheit
EP2527221B1 (fr) Procédé d'opération d'un système d'assistance au conducteur de guidage longitudinal d'un véhicule automobile et véhicule automobile
EP1643269A1 (fr) Système d'assistance au conducteur avec logique floue
DE102004028591A1 (de) Verfahren zum Bereitstellen von fahrstreckenabhängigen Informationen
EP2353958A2 (fr) Procédé d'évaluation de données de capteur concernant l'environnement d'un véhicule automobile d'au moins un capteur d'environnement et véhicule automobile
DE102019129904A1 (de) Automatische Fahrkompetenzanalyse
DE102019201088A1 (de) Verfahren zum Erkennen von Fahrbahnmarkierungen
EP1590193B1 (fr) Systeme de guidage de vehicule
DE102019208890A1 (de) Verfahren zum zumindest teilautomatisierten Führen eines Kraftfahrzeugs
DE102019102919A1 (de) Verfahren, Vorrichtung, Computerprogramm und Computerprogrammprodukt zum Betreiben eines Fahrzeuges

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20040217

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR

AX Request for extension of the european patent

Extension state: AL LT LV MK RO SI

17Q First examination report despatched

Effective date: 20040707

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20110721