WO2022122933A1 - Procédé et dispositif de détection d'objets dans l'environnement d'un véhicule - Google Patents

Procédé et dispositif de détection d'objets dans l'environnement d'un véhicule (Method and device for detecting objects in the surroundings of a vehicle)

Info

Publication number
WO2022122933A1
WO2022122933A1 (PCT/EP2021/085019; EP2021085019W)
Authority
WO
WIPO (PCT)
Prior art keywords
reference point
object frame
point
estimated
orientation
Prior art date
Application number
PCT/EP2021/085019
Other languages
German (de)
English (en)
Inventor
Ruediger Jordan
Diego GIL VAZQUEZ
Original Assignee
Robert Bosch Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch Gmbh filed Critical Robert Bosch Gmbh
Publication of WO2022122933A1 publication Critical patent/WO2022122933A1/fr

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66Radar-tracking systems; Analogous systems
    • G01S13/72Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9327Sensor installation details
    • G01S2013/93271Sensor installation details in the front of the vehicles

Definitions

  • the invention relates to a method for detecting objects in the surroundings of a vehicle. Furthermore, the invention relates to a device, a computer program and a computer-readable medium for carrying out the method mentioned.
  • a high-resolution radar sensor system can be used, for example, to detect objects in the surroundings of a vehicle, such as a partially or fully automated motor vehicle or an autonomous robot.
  • Tracking algorithms such as Bayes filters, particle filters or Kalman filters can be used, which allow the radar data generated by the radar sensor system to be processed.
  • Such algorithms can, for example, process raw data relating to a distance, a radial velocity and an azimuth angle.
  • Several measurement cycles are usually required for a sufficiently accurate estimate of quantities that cannot be measured directly, such as object position, object speed or object acceleration. This can lead to corresponding delays in the estimation of the variables mentioned, particularly in the event of major changes in the state of motion of the objects.
  • For this purpose, a suitable reference point usually has to be selected first.
  • a center point of a point cluster assigned to the object can be determined for this purpose and this center point can be used as a reference point.
  • embodiments of the present invention make it possible to determine positions of objects in the vicinity of a vehicle using a single radar sensor with high accuracy and low latencies.
  • A first aspect of the invention relates to a computer-implemented method for detecting objects in an area surrounding a vehicle. The method comprises at least the following steps: receiving radar data indicative of the objects; determining a first object frame for each object based on the radar data, wherein the first object frame indicates a position and/or orientation of the object estimated over several time steps and a position and/or orientation of the first object frame is defined by a defined number of first reference points in a two-dimensional coordinate system; determining a second object frame for each object based on the radar data, wherein the second object frame indicates a position and/or orientation of the object measured in a current time step and a position and/or orientation of the second object frame is defined by a defined number of second reference points in the two-dimensional coordinate system; determining pairs of reference points, each consisting of a first reference point and a second reference point; determining a reference point distance between the first reference point and the second reference point of each reference point pair; determining at least one preferred reference point based on the reference point distances; and updating the estimated position and/or orientation of the object based on the at least one preferred reference point.
  • the method can be executed automatically by a processor, for example.
  • the radar data can have been generated, for example, using a radar sensor system for detecting the surroundings of the vehicle.
  • the vehicle may be an automobile, such as a car, truck, bus, or motorcycle. In a broader sense, a vehicle can also be understood as an autonomous, mobile robot.
  • the radar sensor system can be, for example, a single high-resolution radar sensor.
  • the radar sensor system can be positioned, for example, in the area of a front or rear bumper of the vehicle. However, other installation locations are also possible.
  • the vehicle can, for example, have other surroundings sensors such as a camera, an ultrasonic or lidar sensor, and/or vehicle dynamics sensors such as an acceleration sensor, wheel speed sensor or steering wheel angle sensor.
  • The vehicle can have a location sensor for determining an absolute position of the vehicle using a global navigation satellite system such as GPS, GLONASS or the like.
  • the vehicle can be equipped with a driver assistance system or robot control system for partially or fully automated activation of an actuator system of the vehicle based on the radar data.
  • the driver assistance system or robot control system can be implemented as hardware and/or software, for example in a control unit of the vehicle.
  • the actuator system can be configured, for example, to steer, accelerate, brake and/or navigate the vehicle.
  • The actuator system can include, for example, a steering actuator, a brake actuator, an engine control unit, an electric motor, or a combination of at least two of the examples mentioned.
  • reflection locations can be determined, at which radar beams emitted by the radar sensor system were reflected. For this purpose, for example, peaks can be detected in a frequency spectrum generated by (fast) Fourier transformation of the radar data.
  • the reflection locations can be defined, for example, by a radial distance, a radial speed and/or an azimuth angle.
  • the reflection locations can then be assigned to specific objects in the area surrounding the vehicle, for example by bundling, which can also be referred to as clustering.
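  • As a rough illustration of the bundling (clustering) step described above, the following Python sketch groups located reflection points into clusters using a simple distance threshold; the function name, the threshold value and the use of Cartesian point coordinates are illustrative assumptions and not part of the patent text.

```python
import numpy as np

def cluster_reflections(points_xy, max_dist=1.5):
    """Greedy single-linkage clustering of reflection locations.

    points_xy: (N, 2) array of reflection locations in a 2D coordinate system.
    max_dist:  assumed gating distance in metres; points closer than this are
               treated as belonging to the same object.
    Returns a list of index arrays, one per cluster.
    """
    n = len(points_xy)
    labels = -np.ones(n, dtype=int)
    current = 0
    for i in range(n):
        if labels[i] >= 0:
            continue
        labels[i] = current
        stack = [i]
        while stack:                       # flood-fill over nearby points
            j = stack.pop()
            d = np.linalg.norm(points_xy - points_xy[j], axis=1)
            for k in np.where((d < max_dist) & (labels < 0))[0]:
                labels[k] = current
                stack.append(k)
        current += 1
    return [np.where(labels == c)[0] for c in range(current)]

# Example: two well separated groups of reflection points
pts = np.array([[10.0, 0.2], [10.5, -0.1], [10.3, 0.4],
                [25.0, 3.0], [25.4, 3.2]])
print(cluster_reflections(pts))   # -> [array([0, 1, 2]), array([3, 4])]
```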
  • the object frames can be generated, for example, by a special object frame generation algorithm.
  • the object frame generation algorithm can be, for example, an appropriately trained classifier, such as an artificial neural network or the like.
  • A first or second object frame can generally be understood as a geometry indicating an outer contour of an object, for example in the form of a rectangle, parallelogram, trapezium or the like. It is possible that the first or second object frame comprises one or more inner edges in addition to the outer edges.
  • a first or second reference point can be understood, for example, as a corner point or a center point between two corner points of the first or second object frame.
  • This center point can be the center of an outer edge, or a frame center of the first or second object frame lying on a (possibly imaginary) diagonal.
  • the first object frame can be predicted with a corresponding dynamic model for a measurement time of the second object frame, for example. This ensures that the first object frame and the second object frame describe the same point in time when the reference points are assigned.
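  • A minimal sketch of such a prediction step, assuming a simple constant-velocity, constant-yaw-rate motion model; the function signature and state layout are assumptions made for illustration only.

```python
import numpy as np

def predict_frame(ref_points, vx, vy, yaw_rate, center, dt):
    """Predict the reference points of the estimated (first) object frame by dt.

    ref_points: (N, 2) reference points of the first object frame.
    vx, vy:     estimated translational velocity components of the object.
    yaw_rate:   estimated rotation rate in rad/s about `center`.
    center:     (2,) assumed pivot point of the object.
    dt:         time span from the last estimate to the current measurement.
    """
    dphi = yaw_rate * dt
    rot = np.array([[np.cos(dphi), -np.sin(dphi)],
                    [np.sin(dphi),  np.cos(dphi)]])
    # rotate the frame about the pivot point, then translate it
    return (ref_points - center) @ rot.T + center + np.array([vx, vy]) * dt
```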
  • the two-dimensional coordinate system can be, for example, a vehicle coordinate system of the vehicle and/or a sensor coordinate system of the radar sensor system.
  • the origin of the two-dimensional coordinate system can be fixed, for example, on an axis of the vehicle or an installation location of the radar sensor system in the vehicle.
  • the estimated position and/or orientation of the object can be determined, for example, using a Kalman filter or any other suitable tracking filter or state estimator.
  • the reference points of a pair of reference points can, for example, be reference points which are arranged at similar locations on the first or second object frame, for example at similar corners and/or similar edges. For example, it is possible that a first reference point in the form of a lower left corner point of the first object frame is combined with a second reference point in the form of a lower left corner point of the second object frame to form a pair of reference points, etc.
  • the first object frame and the second object frame can each have the same number of reference points.
  • the distance between the reference points can be the length of an imaginary connecting line between the reference points of a pair of reference points.
  • The preferred reference point can be, for example, at least one of the two reference points of the reference point pair with the smallest reference point distance, a predefined reference point from a set of the first and/or second reference points, or a newly defined reference point that is neither one of the first reference points nor one of the second reference points.
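  • A minimal sketch of how reference point pairs could be formed and the preferred reference point selected, assuming that same-type reference points share the same array index and that a fixed distance threshold triggers the fallback to a frame centre; all names and numeric values are illustrative assumptions.

```python
import numpy as np

def select_preferred_point(first_pts, second_pts, threshold=1.0):
    """Pair same-type reference points and pick the closest pair, or fall back.

    first_pts, second_pts: (N, 2) arrays; index i of both arrays is assumed to
                           denote the same kind of point (e.g. rear-left corner)
                           of the estimated and the measured object frame.
    threshold:             assumed maximum acceptable reference point distance.
    Returns the preferred reference point and the pair index it came from, or a
    frame-centre fallback (here: mean of the measured points) and None.
    """
    dists = np.linalg.norm(first_pts - second_pts, axis=1)
    best = int(np.argmin(dists))
    if dists[best] <= threshold:
        return second_pts[best], best
    return second_pts.mean(axis=0), None

# Example with three corner pairs of two slightly offset rectangles
est  = np.array([[0.0, 0.0], [2.0, 0.0], [2.0, 1.0]])
meas = np.array([[0.3, 0.1], [2.1, 0.0], [2.2, 1.4]])
print(select_preferred_point(est, meas))   # picks the pair with distance 0.1
```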
  • the coordinate values of the preferred reference point and/or position values based on the coordinate values of the preferred reference point can be input into the above-mentioned Kalman filter.
  • An estimation error assigned to the preferred reference point, for example in the form of a covariance matrix, for example a measurement covariance matrix, can also be determined.
  • the estimated position and/or orientation of the object can then also be updated based on the estimation error.
  • a second aspect of the invention relates to a device for data processing.
  • the device can be a computer, for example in the form of a control device, a server or the like.
  • the device includes a processor that is configured to carry out the method according to an embodiment of the first aspect of the invention.
  • the device may include hardware and/or software modules.
  • the device may include memory and data communication interfaces for data communication with peripheral devices.
  • Features of the method according to an embodiment of the first aspect of the invention can also be features of the device and vice versa.
  • a third aspect of the invention relates to a computer program.
  • The computer program comprises instructions which, when the computer program is executed by a processor, cause the processor to carry out the method according to an embodiment of the first aspect of the invention.
  • a fourth aspect of the invention relates to a computer-readable medium on which the computer program according to an embodiment of the third aspect of the invention is stored.
  • the computer-readable medium can be volatile or non-volatile data storage.
  • the computer-readable medium can be a hard drive, USB storage device, RAM, ROM, EPROM, or flash memory.
  • the computer-readable medium can also be a data communication network such as the Internet or a data cloud (cloud) enabling a download of a program code.
  • The radar data were generated using a radar sensor system for detecting the surroundings of the vehicle. At least one edge of the first object frame facing the radar sensor system is identified, and selected first reference points are determined which lie on this at least one facing edge of the first object frame.
  • the pairs of reference points are each formed from a selected first reference point and a second reference point.
  • An edge facing the radar sensor system can be identified, for example, based on a position and/or orientation of the edge relative to the radar sensor system, for example relative to the origin of the two-dimensional coordinate system.
  • the first reference points lying on the identified edge or the identified edges can be, for example, corner points or points lying between two corner points, for example midpoints.
  • the preferred reference point is selected from the pair of reference points with the smallest reference point distance.
  • the preferred reference point can be selected or determined from those reference points with the greatest agreement. This allows the preferred reference point to be determined with a few simple calculation steps.
  • the smallest reference point distance is compared to a threshold value.
  • a predefined reference point can be used as the preferred reference point if the smallest reference point distance is greater than the threshold value.
  • the predefined reference point can be a reference point lying within the first or second object frame, for example a frame center point or (estimated) pivot point of the respective object. This embodiment can prevent the preferred reference point from being determined based on first and second reference points that are too far apart. Coarse estimation errors when updating the estimated position and/or orientation of the objects can thus be avoided.
  • the predefined reference point is a center point of the first object frame and/or the second object frame.
  • the estimated position and/or orientation of the objects can thus be updated with sufficient accuracy even if none of the first or second reference points is suitable as the preferred reference point. It is conceivable that the predefined reference point is also selected as the preferred reference point when several of the first or second reference points come into question as preferred reference points at the same time.
  • an estimation error with regard to the estimated position and/or orientation of the object is determined as a function of a position of the preferred reference point on the first object frame and/or the second object frame.
  • the estimation error can turn out to be greater the further away the preferred reference point is from the radar sensor system, for example from the origin of the two-dimensional coordinate system.
  • it can be determined, for example, whether the preferred reference point is on an edge facing or facing away from the radar sensor system and/or is a corner point, an outer edge center point or a frame center point of the respective object frame.
  • the estimation error can be determined using simple means and with sufficient accuracy.
  • a smaller estimation error is determined when the preferred reference point is a corner point and a larger estimation error is determined when the preferred reference point is not a corner point. Assuming that the radar sensor system usually detects corner points very precisely, the estimation error can be determined very easily in this way.
  • the reference point distances are weighted with probabilities.
  • the preferred reference point is determined based on a sum of the reference point distances weighted with the probabilities.
  • At least three pairs of reference points are determined for each object.
  • In this way, the probability that a suitable preferred reference point can be determined from the first or second reference points can be significantly increased compared to embodiments in which fewer than three reference point pairs are determined for each object.
  • the first reference points are corner points of the first object frame and the second reference points are corner points of the second object frame.
  • the pairs of reference points are each formed from a corner point of the first object frame and a corresponding corner point of the second object frame.
  • the first reference points can be midpoints between two corner points of the first object frame and the second reference points can be midpoints between two corner points of the second object frame.
  • the pairs of reference points can each be formed from a center point of the first object frame and a corresponding center point of the second object frame.
  • The method further includes: determining a radial velocity and an azimuth angle for at least two reflection locations of each object based on the radar data; determining estimated angle parameters of an angle function that describes the radial speed as a function of the azimuth angle, the estimated angle parameters for each object being determined based on the radial speeds and azimuth angles assigned to the object; determining an estimated pivot point for each object based on the radar data; and determining a velocity for each object as a function of the estimated orientation, pivot point, and angle parameters of the object.
  • An azimuth angle can be understood as an angle between a vector of the radial velocity and a reference axis of the two-dimensional coordinate system.
  • The following function, also known as the Doppler profile, can be used as the angle function:
  • v_d = C·cos(θ) + S·sin(θ),
  • where C stands for a first angle parameter, S for a second angle parameter, v_d for the radial or Doppler velocity and θ for the azimuth angle.
  • The angle parameters can be calculated, for example, using the least squares method or another suitable optimization method from a plurality of radial velocity and azimuth angle values for each object. This embodiment enables fast and accurate calculation of object speeds using a single radar sensor. In particular, the speeds of the objects can be calculated with sufficient accuracy without undesired delays occurring when the respective object state of the objects is updated.
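  • Such a least-squares fit of the angle parameters could, as a sketch, look like the following; the variable names and the use of numpy's linear least-squares solver are assumptions, not a prescription from the patent.

```python
import numpy as np

def fit_doppler_profile(azimuth, doppler):
    """Fit v_d = C*cos(theta) + S*sin(theta) to measured reflections.

    azimuth: (N,) azimuth angles theta of the reflection locations in radians.
    doppler: (N,) measured radial (Doppler) velocities v_d.
    Returns the estimated angle parameters (C, S).
    """
    A = np.column_stack((np.cos(azimuth), np.sin(azimuth)))
    (C, S), *_ = np.linalg.lstsq(A, doppler, rcond=None)
    return C, S

# Example: reflections generated from C = 8.0, S = -1.5 plus a little noise
theta = np.deg2rad(np.array([-10.0, -3.0, 2.0, 7.0, 12.0]))
v_d = 8.0 * np.cos(theta) - 1.5 * np.sin(theta) + np.random.normal(0.0, 0.05, 5)
print(fit_doppler_profile(theta, v_d))     # close to (8.0, -1.5)
```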
  • At least one of the following calculation steps is carried out to determine the speed:
  • This embodiment allows the speed(s) of each detected object to be calculated particularly efficiently.
  • A corresponding speed estimation error, for example in the form of a covariance matrix, can be calculated and taken into account for at least one of the above speeds.
  • FIG. 1 schematically shows a vehicle with a control device according to an exemplary embodiment of the invention.
  • Fig. 2 schematically shows different modules of the control unit from Fig. 1.
  • FIG. 3 shows a coordinate system with object frames that were generated by the control device from FIG. 1.
  • FIG. 4 shows a coordinate system with object frames that were generated by the control device from FIG. 1.
  • FIG. 5 shows a diagram to illustrate geometric quantities such as are used for object detection in a method according to an embodiment of the invention.
  • Fig. 1 shows a vehicle 100 with a radar sensor system 102 for detecting objects 104 in an area surrounding the vehicle 100, here by way of example a vehicle driving ahead 104, and a control unit 106 for evaluating radar data 108 generated by the radar sensor system 102.
  • The radar sensor system 102 transmits radar beams 109, which impinge on the preceding vehicle 104 at several reflection sites 110, such as at its rear and/or its rear axle, and are partially reflected back from there to the radar sensor system 102.
  • the radar data 108 can be generated by appropriate pre-processing of the reflected portion of the radar beams 109, for example by demodulation, amplification and/or digitization.
  • Control unit 106 is configured to carry out a method, which is described in more detail below, for detecting object 104 based on radar data 108 .
  • control unit 106 includes a memory 112 on which a computer program is stored, and a processor 114 for executing the computer program, it being possible for the method mentioned to be carried out by executing the computer program.
  • Furthermore, control unit 106 can be configured to generate, based on radar data 108, a control signal 116 for automatically activating an actuator system 118 of vehicle 100, such as a steering or brake actuator, an engine control unit, or an electric motor of vehicle 100.
  • vehicle 100 can be equipped with a suitable driver assistance function, which can be integrated into control unit 106 .
  • vehicle 100 can be an autonomous robot with a suitable robot control program for automatically controlling actuator system 118 .
  • Fig. 2 shows various modules of control unit 106. These can be hardware and/or software modules.
  • the radar data 108 are first received in a bundling module 200 .
  • the bundling module 200 can be configured, for example, to locate the reflection sites 110 based on the radar data 108 as points in a two-dimensional coordinate system 202 and to bundle these points into clusters, each of which is associated with a detected object 104, here the vehicle 104 driving ahead.
  • the coordinate system 202 can be, for example, a sensor coordinate system of the radar sensor system 102, the origin of which can be fixed at the installation site of the radar sensor system 102.
  • the x-axis of the coordinate system 202 extends in the longitudinal direction of the vehicle 100 here.
  • the cluster data 204 is input to an object frame generation algorithm that may be configured to generate a first object frame 208 and a second object frame 210 for each object 104 from the cluster data 204 .
  • the object frame generation algorithm can be an appropriately trained classifier, for example in the form of an artificial neural network.
  • the first object frame 208 indicates a position and/or orientation of the object 104 estimated over several time steps by a state estimator (not shown).
  • the state estimator can be a Kalman filter, for example, or can be based on a Kalman filter.
  • the first object frame 208 is generated in the form of a rectangle whose position and/or orientation in the coordinate system 202 is defined by a defined number of first reference points 212, here by nine first reference points 212 (marked in bold in Fig. 2 ).
  • Second object frame 210 indicates a position and/or orientation of object 104 measured by radar sensor system 102 in a current time step.
  • The second object frame 210 is also generated in this example in the form of a rectangle whose position and/or orientation in the coordinate system 202 is defined by a defined number of second reference points 214, here by nine second reference points 214 (marked with crosses in Fig. 2).
  • the first object frame 208 and the second object frame 210 can have an identical number of reference points 212 and 214, respectively.
  • the reference points 212 or 214 can be corner points, outer edge centers or frame centers of the first object frame 208 or the second object frame 210 .
  • Other positions of the reference points 212 or 214 are also possible.
  • the object frame generation module 206 determines a plurality of reference point pairs 216, each consisting of a first reference point 212 and a second reference point 214.
  • Each of the first reference points 212 can be assigned a similarly positioned second reference point 214, as indicated schematically in Fig. 2.
  • Reference point distances between the two reference points 212, 214 of each reference point pair 216 are then determined by the object frame generation module 206.
  • the reference point distances can be calculated, for example, in each case as the length of an (imaginary) straight connecting line (shown as a dashed line in FIG. 2) between the respective reference points 212, 214.
  • the object frame generation module 206 determines one or more preferred reference points 220. For example, the first reference point 212 or the second reference point 214 of the reference point pair 216 with the smallest reference point distance can be selected as the preferred reference point 220. However, other selection methods are also possible, as described in more detail below.
  • Based on the preferred reference point 220, the state estimator finally updates the estimated position and/or orientation of the object 104 and outputs correspondingly updated position values d_x, d_y for the position of the object 104 and/or a correspondingly updated orientation value φ for the orientation of the object 104.
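  • As a sketch of this update step, the following shows how the coordinates of the preferred reference point could be fed into a linear Kalman measurement update; restricting the state to the position (d_x, d_y) and the chosen covariance values are simplifying assumptions.

```python
import numpy as np

def kalman_position_update(x, P, z, R):
    """One linear Kalman update of an estimated object position.

    x: (2,) estimated position belonging to a first reference point.
    P: (2, 2) covariance of that estimate.
    z: (2,) measured position of the preferred reference point.
    R: (2, 2) measurement covariance assigned to the preferred reference point.
    """
    H = np.eye(2)                          # the position is measured directly
    e = z - H @ x                          # innovation vector
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    return x + K @ e, (np.eye(2) - K @ H) @ P

x, P = np.array([12.0, 1.0]), np.diag([0.5, 0.5])
z, R = np.array([12.4, 1.1]), np.diag([0.2, 0.2])
print(kalman_position_update(x, P, z, R))
```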
  • At least three reflection locations 110 for the object 104 should be determined.
  • the first object frame 208 or the second object frame 210 can then be placed around these at least three reflection locations 110 .
  • The object frame generation module 206 can be configured to determine corner points, midpoints between two corner points (i.e. edge midpoints) and/or frame midpoints as reference points 212 and 214, respectively (see also Fig. 3 and Fig. 4).
  • The object frame generation module 206 identifies edges of the first object frame 208 facing the radar sensor system 102, for example based on their respective position and/or orientation relative to the radar sensor system 102, and determines selected first reference points 212' which lie on these identified edges (marked with bold lines in FIGS. 3 and 4).
  • The edges identified as facing the radar sensor system can be, for example, two edges that meet at the corner point of the first object frame 208 that is closest to the radar sensor system 102.
  • The object frame generation module 206 can determine at least three selected first reference points 212' for the object 104 in this example (see FIGS. 3 and 4).
  • First reference points 212 which lie on the edges facing the radar sensor system 102 but are furthest away from the radar sensor system 102, and can therefore be detected less reliably, are discarded.
  • both corner points of this edge can also be determined as selected first reference points 212' . This ensures that three reference points 212 ′ are always selected as candidates for updating the estimated position and/or orientation of the object 104 .
  • The preferred reference point 220 can be selected from the reference points 212' and 214, for example, as follows.
  • three pairs of reference points 216 are formed from one each of the selected first reference points 212' and one of the second reference points 214. Only reference points of the same type are combined with one another.
  • the left, rear corner point of the first object frame 208 is combined with the left, rear corner point of the second object frame 210, etc.
  • the corresponding reference point distances are then determined.
  • the reference points 212', 214 that are closest to each other are finally selected for update (marked with small circles in Figures 3 and 4).
  • a predefined reference point can be selected if the reference point spacing of the selected reference points 212', 214 exceeds a specific threshold value.
  • the predefined reference point can be a frame center point of the first object frame 208 or of the second object frame 210, for example.
  • An estimation error can also be calculated in the form of a covariance matrix. The estimation error can vary depending on the respective position of the preferred reference point 220. Three cases can be distinguished here.
  • In the first case, the preferred reference point 220 is a corner point. It is assumed that its position was recorded relatively accurately. Accordingly, a small-magnitude circular covariance K is determined.
  • In the second case, the preferred reference point 220 is a midpoint of an outer edge. With regard to the position of the preferred reference point 220, a higher measurement uncertainty than in the first case is assumed. A covariance K is thus determined which is expanded to form an ellipse in the direction of the respective outer edge, it being possible for the length of the ellipse to depend on the length of the respective outer edge.
  • In the third case, the preferred reference point 220 lies in the middle of the respective object frame 208 or 210. It was therefore not possible to find a suitable reference point on the object surface, i.e. on the outer edges of the object frame 208 or 210. Accordingly, a comparatively high covariance K is determined, which can depend on the length and width of the object 104. As a result, the estimated position and/or orientation of the object 104 is only slightly adjusted.
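  • The three cases could be mapped to measurement covariances roughly as in the following sketch; the numeric standard deviations and the scaling with edge length and object dimensions are illustrative assumptions.

```python
import numpy as np

def measurement_covariance(point_type, edge_direction=None, edge_length=1.0,
                           obj_length=4.5, obj_width=2.0):
    """Assign a covariance K depending on the type of the preferred reference point.

    point_type:     'corner', 'edge_midpoint' or 'frame_center'.
    edge_direction: unit vector along the outer edge (only for 'edge_midpoint').
    """
    if point_type == 'corner':
        return np.eye(2) * 0.15**2                    # small, circular covariance
    if point_type == 'edge_midpoint':
        d = np.asarray(edge_direction, dtype=float)
        d = d / np.linalg.norm(d)
        n = np.array([-d[1], d[0]])                   # normal to the edge
        # ellipse stretched along the edge, its length tied to the edge length
        return (0.5 * edge_length)**2 * np.outer(d, d) + 0.15**2 * np.outer(n, n)
    # frame centre: comparatively large covariance tied to object length and width
    return np.diag([(0.5 * obj_length)**2, (0.5 * obj_width)**2])

print(measurement_covariance('edge_midpoint', edge_direction=[1.0, 0.0],
                             edge_length=4.5))
```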
  • The respective estimation errors, i.e. the respective covariances K of the reference points 212' coming into question as preferred reference points 220, are marked in FIGS. 3 and 4 by way of example with larger circles or ellipses.
  • The object frame generation algorithm can, for example, carry out the steps described below.
  • the preferred reference point 220 can be calculated, for example, by calculating a weighted sum of all possibilities, as described below.
  • the information required for updating the estimated position and/or orientation of the object 104 is calculated here to a certain extent from a weighted sum of a plurality of reference points 212 and 214, respectively. It should be noted here that the weighted sum is not calculated from the reference points 212 or 214 per se, but from the associated reference point distances in the form of an innovation vector. The merged result can then be linked to any reference point 212 or 214.
  • Probabilistic data association was originally developed to solve the measurement association problem in the presence of erroneous measurement values. To do this, a weighted sum of all measurements found in the association gate is calculated. The weight depends on the deviations between the expected object and the measurement. The better the expectation and measurements match, the higher the weight for this measurement. Finally, a merged pseudo-measurement or innovation vector is calculated, which can be used for updating, for example in a Kalman filter. Furthermore, a fused association probability is calculated, which expresses the overall success. If at least one measurement agrees well, a high value, close to 1, is returned. If none of the measurements agree with the expectation, the probability of the fused association is approximately 0.
  • The likelihood ratio for each measurement j, as well as the probability of no detection with index 0, can be calculated from the following quantities:
  • y stands for the measurement
  • R for its covariance
  • x for the expected or predicted state
  • P for its covariance
  • H for the measurement matrix
  • V c for the measurement volume
  • β_j for the false detection density
  • e for the innovation vector
  • S for its covariance.
  • p_j is the probability that measurement j belongs to the predicted object, and p_0 is the probability that no measurement belongs to the predicted object.
  • the associated covariance matrices are calculated in advance.
  • the first reference points 212 are used as the predicted state x. From this, the associated innovation vectors e are calculated.
  • the PDA algorithm is executed as described above.
  • the final results are the merged innovation vector and the merged covariance matrix.
  • the innovation vector is added to any first reference point 212, 212' of the first object frame 208 and used for the position update. It is advantageous to select the closest or best fitting first reference point 212, 212' for the update.
  • the fused probability can be used not only to weight the update, but also as a rejection criterion. If the fusion probability is too low, the update is performed with the center of the respective object frame with correspondingly high covariances.
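  • A compact sketch of this probabilistic data association step, fusing the innovation vectors of several candidate reference point pairs into one weighted innovation vector and one fused association probability; the Gaussian likelihood form and the fixed no-detection weight are simplifying assumptions and do not reproduce the exact formulas referred to above (which additionally involve the measurement volume V_c and the false detection density β_j).

```python
import numpy as np

def pda_fuse_innovations(innovations, S, w_nodetect=0.1):
    """Fuse candidate innovation vectors with PDA-style weights.

    innovations: list of (2,) innovation vectors e_j between paired reference points.
    S:           (2, 2) innovation covariance, assumed equal for all candidates.
    w_nodetect:  assumed unnormalised weight of the 'no suitable pair' hypothesis.
    Returns the fused innovation vector and the fused association probability.
    """
    S_inv = np.linalg.inv(S)
    norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(S)))
    likelihoods = np.array([norm * np.exp(-0.5 * e @ S_inv @ e) for e in innovations])
    weights = np.append(likelihoods, w_nodetect)
    weights = weights / weights.sum()             # normalised association weights
    e_fused = sum(w * e for w, e in zip(weights, innovations))
    p_fused = 1.0 - weights[-1]                   # probability that some pair fits
    return e_fused, p_fused

S = np.diag([0.3, 0.3])
candidates = [np.array([0.2, 0.1]), np.array([1.5, 1.2]), np.array([0.25, 0.0])]
print(pda_fuse_innovations(candidates, S))
```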
  • Exemplary weights calculated by the PDA algorithm for each of the selected first reference points 212' are outlined with rectangles in FIG. 4.
  • the preferred reference point 220 results here from the weighted sum of the innovation vectors. In this example, the preferred reference point 220 is assigned to the frame center of the first object frame 208 .
  • controller 106 may be configured to determine a speed of object 104 .
  • The bundling module 200 determines, based on the radar data 108, a radial or Doppler velocity v_d and an associated azimuth angle θ for at least two different reflection locations 110 of the detected object 104.
  • In an angle parameter estimation module 222, parameters of a Doppler velocity profile are estimated based on the radial velocities v_d and azimuth angles θ assigned to the object 104, more precisely a first angle parameter C and a second angle parameter S of an angle function 223, which describes the radial velocity v_d as a function of the azimuth angle θ as a sinusoidal curve.
  • The angle parameters C, S can then be input into a speed estimation module 224 together with the orientation value φ and the pivot point coordinates x_R, y_R of an estimated pivot point of the object 104, which can be estimated and provided, for example, by the object frame generation module 206. The speed estimation module 224 calculates the speed of the object 104 from the variables mentioned, for example a translation speed v with a first speed component v_x and a second speed component v_y and/or a rotation speed ω with respect to the pivot point coordinates x_R, y_R.
  • The geometric variables used to determine the speed are illustrated in FIG. 5.
  • Control unit 106 can also include a control module 226, which can be configured to generate the control signal 116 from the estimated position, orientation and/or speed of the object 104, i.e. from the position values d_x, d_y, the orientation value φ, the first speed component v_x, the second speed component v_y and/or the rotation speed ω, and to output it to the actuator system 118.
  • Control signal 116 can cause actuator system 118, for example, to accelerate, brake, and/or steer vehicle 100 in a corresponding manner.
  • The angle parameters C, S can be estimated from the radar data 108 if at least two reflection locations 110 of the object 104 were measured at different azimuth angles θ.
  • the angle parameters C,S themselves have no physical meaning, but can be used, for example, to directly update a Kalman filter using an appropriate observation model, so that the observation model can estimate the full motion state of the tracked object 104 .
  • The velocity profile in the form of the angle function 223 describes the kinematics of an extended single rigid body with multiple reflection locations 110, or of a group of objects with an identical state of motion, for example a group of stationary objects, with at least two reflection locations 110.
  • For this purpose, the Doppler velocity v_d is examined as a function of the azimuth angle θ.
  • the measured velocity curve corresponds to a cosine curve with the two degrees of freedom amplitude and phase shift.
  • The equation for the measured velocity profile is the angle function given above, v_d = C·cos(θ) + S·sin(θ).
  • The sought-after angle parameters C, S can be determined from it using the least squares method.
  • The covariance matrix of the angle parameters is calculated via the Jacobian matrix and the measured azimuth angle and Doppler velocity covariance matrix of the two extreme reflection locations. This formula is also known as the general law of error propagation.
  • p can be calculated directly from the angle parameters C, S, provided that only the two outermost reflection locations, i.e. the two reflection locations with the smallest and largest azimuth angle, are taken into account.
  • The two resulting equations can then be partially differentiated.
  • The mere number of reflection locations is not the decisive factor for the accuracy of the angle parameters C, S. Rather, it is decisive that the distance between the outermost reflection locations, i.e. the detected angular section, is large enough and that the measurement errors of the azimuth angle and Doppler velocity are sufficiently low. Even if the angular section is relatively small and only a few reflection locations are measured, the angle parameter C can still be estimated relatively accurately if the respective object is located in the center of the field of view of the radar sensor system 102.
  • If the object frame 208 or 210 is large enough and well aligned around the reflection locations 110, its orientation φ can be used to calculate v_x, v_y and ω directly from the pre-calculated angle parameters C, S. This is possible with just one radar sensor in just one measurement cycle.
  • The angle parameters C, S can be defined in terms of the object's translational and rotational motion, as outlined in the sketch below.
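  • The sketch below shows one way the speed components could be recovered from C, S, the orientation φ and the pivot point (x_R, y_R). It rests on two assumptions that are not spelled out in the text above: that, with the sensor at the coordinate origin, the rigid-body velocity profile gives C = v·cos(φ) + ω·y_R and S = v·sin(φ) - ω·x_R, and that the object moves along its orientation without side slip.

```python
import numpy as np

def velocity_from_profile(C, S, phi, x_R, y_R):
    """Solve for speed v along the orientation phi and yaw rate omega.

    Assumed model: C = v*cos(phi) + omega*y_R,  S = v*sin(phi) - omega*x_R,
    i.e. the Doppler profile parameters equal the rigid body's velocity field
    evaluated at the sensor position (the coordinate origin).
    """
    A = np.array([[np.cos(phi),  y_R],
                  [np.sin(phi), -x_R]])
    v, omega = np.linalg.solve(A, np.array([C, S]))
    return v * np.cos(phi), v * np.sin(phi), omega   # v_x, v_y, omega

# Example: object oriented 5 degrees to the left, pivot 15 m ahead and 1 m left
print(velocity_from_profile(C=8.0, S=-1.5, phi=np.deg2rad(5.0), x_R=15.0, y_R=1.0))
```

  • If the determinant x_R·cos(φ) + y_R·sin(φ) is close to zero, this system is ill-conditioned and a correspondingly large speed estimation error should be assumed.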

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to a method for detecting objects (104) in the surroundings of a vehicle (100). The method comprises the following steps: receiving radar data (108) indicative of the objects (104); determining a first object frame (208) for each object (104) based on the radar data (108), the first object frame (208) indicating a position (dx, dy) and/or an orientation (φ) of the object (104) estimated over several time steps, and a position and/or orientation of the first object frame (208) being defined by a defined number of first reference points (212, 212') in a two-dimensional coordinate system (202); determining a second object frame (210) for each object (104) based on the radar data (108), the second object frame (210) indicating a position and/or orientation of the object (104) measured in a current time step, and a position and/or orientation of the second object frame (210) being defined by a defined number of second reference points (214) in the two-dimensional coordinate system (202); determining reference point pairs (216), each consisting of a first reference point (212, 212') and a second reference point (214); determining a reference point distance between the first reference point (212, 212') and the second reference point (214) of each reference point pair (216); determining at least one preferred reference point (220) based on the reference point distances; and updating the estimated position (dx, dy) and/or orientation (φ) of the object (104) based on the preferred reference point (220).
PCT/EP2021/085019 2020-12-09 2021-12-09 Procédé et dispositif de détection d'objets dans l'environnement d'un véhicule WO2022122933A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102020215504.6 2020-12-09
DE102020215504.6A DE102020215504A1 (de) 2020-12-09 2020-12-09 Verfahren und Vorrichtung zum Detektieren von Objekten in einer Umgebung eines Fahrzeugs

Publications (1)

Publication Number Publication Date
WO2022122933A1 true WO2022122933A1 (fr) 2022-06-16

Family

ID=79185624

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/085019 WO2022122933A1 (fr) 2020-12-09 2021-12-09 Procédé et dispositif de détection d'objets dans l'environnement d'un véhicule

Country Status (2)

Country Link
DE (1) DE102020215504A1 (fr)
WO (1) WO2022122933A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116824213A (zh) * 2023-05-17 2023-09-29 杭州新中大科技股份有限公司 一种基于多视角特征融合的渣土车带泥检测方法及装置

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102022209126A1 (de) 2022-09-02 2024-03-07 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren und Vorrichtung zur Verarbeitung von Radardaten, Radarsystem
DE102022124192A1 (de) 2022-09-21 2024-03-21 Bayerische Motoren Werke Aktiengesellschaft Verfahren und Vorrichtung zur Aktualisierung des Objektzustands eines Objektes

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3349033A1 (fr) * 2017-01-13 2018-07-18 Autoliv Development AB Détection d'objet améliorée et estimation de l'état de mouvement pour un système de détection d'environnement de véhicule
US20200191942A1 (en) * 2018-12-18 2020-06-18 Hyundai Motor Company Apparatus and method for tracking target vehicle and vehicle including the same
DE102019109332A1 (de) * 2019-04-09 2020-10-15 Bayerische Motoren Werke Aktiengesellschaft Verfahren und Verarbeitungseinheit zur Ermittlung eines Objekt-Zustands eines Objektes

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3349033A1 (fr) * 2017-01-13 2018-07-18 Autoliv Development AB Détection d'objet améliorée et estimation de l'état de mouvement pour un système de détection d'environnement de véhicule
US20200191942A1 (en) * 2018-12-18 2020-06-18 Hyundai Motor Company Apparatus and method for tracking target vehicle and vehicle including the same
DE102019109332A1 (de) * 2019-04-09 2020-10-15 Bayerische Motoren Werke Aktiengesellschaft Verfahren und Verarbeitungseinheit zur Ermittlung eines Objekt-Zustands eines Objektes

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A. SCHEEL, K. DIETMAYER: "Tracking Multiple Vehicles Using a Variational Radar Model", IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, vol. 2
D. KELLNER, M. BARJENBRUCH, J. KLAPPSTEIN: "Instantane Bestimmung der vollständigen Objektbewegung ausgedehnter Objekte mittels hochauflösendem Radar", vol. 30

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116824213A (zh) * 2023-05-17 2023-09-29 杭州新中大科技股份有限公司 一种基于多视角特征融合的渣土车带泥检测方法及装置

Also Published As

Publication number Publication date
DE102020215504A1 (de) 2022-06-09

Similar Documents

Publication Publication Date Title
WO2022122933A1 (fr) Procédé et dispositif de détection d'objets dans l'environnement d'un véhicule
EP3177498B1 (fr) Procédé servant à produire une carte des environs d'une zone des environs d'un véhicule automobile, système d'assistance au conducteur ainsi que véhicule automobile
DE102019115240A1 (de) Systeme und verfahren für die anwendung von landkarten zur verbesserung von objektverfolgung, fahrstreifenzuordnung und klassifizierung
DE102020105192B4 (de) Verfahren zum Detektieren bewegter Objekte in einer Fahrzeugumgebung und Kraftfahrzeug
DE102018216999A1 (de) Verfahren, Computerprogramm und Messsystem für die Auswertung von Bewegungsinformationen
DE102020205468A1 (de) Autonomes und/oder assistiertes Kuppeln eines Anhängers unter Berücksichtigung des Höhenprofils des Untergrunds
DE102020117773A1 (de) Verfahren zum ermitteln einer ersatztrajektorie, computerprogrammprodukt, parkassistenzsystem und fahrzeug
EP3809316A1 (fr) Prédiction d'un tracé de route en fonction des données radar
DE102019204408B4 (de) Verfahren zur Bestimmung der Gierrate eines Zielobjekts auf Grundlage der Sensordaten beispielsweise eines hochauflösenden Radars
WO2022194576A1 (fr) Procédé de détermination d'un état de mouvement d'un corps rigide
DE102022101054A1 (de) Verfahren zum Planen eines Pfades für ein wenigstens teilweise automatisiertes Kraftfahrzeug, Computerprogrammprodukt, computerlesbares Speichermedium sowie Assistenzsystem
WO2022129266A1 (fr) Procédé de détection d'au moins un objet d'un environnement au moyen de signaux de réflexion d'un système capteur radar
DE102020116027A1 (de) Verfahren und Vorrichtung zur Ermittlung von Belegungsinformation für einen Umfeldpunkt auf Basis von Radardetektionen
DE102019211327B4 (de) Vorrichtung, Fahrzeug und Verfahren zur verbesserten Multi-Radar-Sensor-Kalibration
DE112020004020T5 (de) Fahrzeugkollisionsbestimmungsvorrichtung
DE102019131334A1 (de) Verfahren zur Spur verfolgen von ausgedehnten Objekten mit wenigstens einer Detektionsvorrich-tung und Detektionsvorrichtung
WO2020001690A1 (fr) Procédé et système de reconnaissance d'obstacles
DE102022204767A1 (de) Verfahren und Vorrichtung zum Verarbeiten von Sensordaten einer Sensorik eines Fahrzeugs
WO2021233674A1 (fr) Procédé de détermination d'une position initiale d'un véhicule
DE102023109228A1 (de) Selbstabschätzende ortungseinrichtung für mobiles objekt
DE102021202019A1 (de) Verfahren und Vorrichtung zum Ermitteln eines Aufenthaltsbereichs eines Objekts
DE102022123176A1 (de) Filtern und aggregieren von detektionspunkten einer radarpunktwolke für ein autonomes fahrzeug
DE102023100670A1 (de) Systeme und verfahren zur erkennung von verkehrsobjekten
DE102021124329A1 (de) Geometrische Charakterisierung eines Fahrspurbegrenzers
DE102021122432A1 (de) Eigenposition-Schätzverfahren

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21835686

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21835686

Country of ref document: EP

Kind code of ref document: A1