CN116413734A - Vehicle tracking detection method and device based on road side sensor


Publication number
CN116413734A
CN116413734A (application CN202111654422.7A)
Authority
CN
China
Legal status: Pending
Application number
CN202111654422.7A
Other languages
Chinese (zh)
Inventor
林少栋
Current Assignee
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Priority application: CN202111654422.7A
PCT application: PCT/EP2022/087856 (publication WO2023126390A1)
Publication: CN116413734A

Classifications

    • G01S 13/726: Radar-tracking systems for two-dimensional tracking by using numerical data; multiple target tracking
    • G01S 13/89: Radar or analogous systems specially adapted for mapping or imaging
    • G01S 17/66: Tracking systems using electromagnetic waves other than radio waves (e.g. lidar)
    • G01S 17/89: Lidar systems specially adapted for mapping or imaging
    • G06V 20/54: Surveillance or monitoring of activities of traffic, e.g. cars on the road


Abstract

The invention relates to a vehicle tracking detection method based on a roadside sensor, which comprises the following steps: associating a plurality of target vehicles with the point cloud data measured by the roadside sensor, respectively, so as to obtain a plurality of targets aggregated from the point cloud data; and determining whether to merge and/or fine-tune two of the plurality of targets by projecting the two targets along directions parallel to their sides. The invention also relates to a roadside-sensor-based vehicle tracking detection device, a computer storage medium, a computer program product, and a radar tracking system.

Description

Vehicle tracking detection method and device based on road side sensor
Technical Field
The present invention relates to the field of vehicle tracking and detection, and more particularly, to a vehicle tracking detection method and apparatus based on a roadside sensor, a computer storage medium, a computer program product, and a radar tracking system.
Background
Roadside perception uses multiple sensors (visual sensors, millimeter-wave radar, lidar, and the like), combined with edge computing equipment, to acquire information about current road traffic participants and road conditions in real time. Through V2X vehicle-road cooperation technology, information exchange and instruction control among vehicles, people, roads, and the cloud are realized according to agreed communication protocols and data interaction standards. Roadside perception effectively compensates for a vehicle's perception blind spots, provides timely warnings to drivers, enables traffic departments to cooperatively schedule vehicles within a certain range, and can effectively alleviate urban road congestion.
In existing tracking systems based on roadside sensors (e.g., radar sensors), the data association threshold is extended by a constant value, and no limit is placed on the length and width of the target object during data association. Because a roadside sensor is generally installed at the roadside, at a high mounting position and far from the target object, two vehicles traveling in parallel may be merged in an actual tracking scene and treated as one large object. This is undesirable.
Disclosure of Invention
According to an aspect of the present invention, there is provided a vehicle tracking detection method based on a roadside sensor, the method including: associating a plurality of target vehicles with the point cloud data measured by the roadside sensor, respectively, so as to obtain a plurality of targets aggregated from the point cloud data; and determining whether to merge and/or fine-tune two of the plurality of targets by projecting the two targets along directions parallel to their sides.
Additionally or alternatively to the above, in the above method, associating the plurality of target vehicles with the measured point cloud data includes: projecting a length expansion threshold and a width expansion threshold of a first target vehicle among the plurality of target vehicles into a rectangular coordinate system whose axis is the line from the roadside sensor to the first target vehicle, so as to obtain projection offset values in the rectangular coordinate system; and calculating a radial threshold and an angular threshold in the polar coordinate system of the roadside sensor based on the projection offset values, thereby determining an association gate for data association.
Additionally or alternatively to the above, in the above method, the projection offset values vary with the pose of the first target vehicle.
Additionally or alternatively to the above, in the above method, calculating the radial threshold and the angular threshold in the polar coordinate system of the roadside sensor based on the projection offset values in the rectangular coordinate system includes: converting the sum of the projection offset values and an adjustable step size into the radial threshold and the angular threshold in the polar coordinate system.
Additionally or alternatively to the above, in the above method, associating the plurality of target vehicles with the measured point cloud data, respectively, so as to obtain a plurality of targets aggregated from the point cloud data further includes: setting a maximum length and a maximum width for the plurality of targets.
Additionally or alternatively to the above, in the above method, determining whether to merge and/or fine-tune two of the plurality of targets by projecting along directions parallel to the edges of the two targets includes: determining the projection space between a first target vehicle and a second target vehicle in a plurality of projection directions by projecting along directions parallel to the sides of the two targets; and determining whether to merge and/or fine-tune the first and second target vehicles according to the projection space.
Additionally or alternatively to the above, in the above method, the sizes and center points of the first target vehicle and the second target vehicle are corrected when the maximum value of the projection space is within a predefined range.
Additionally or alternatively to the above, in the above method, determining whether to merge and/or fine-tune two of the plurality of targets by projecting along directions parallel to the edges of the two targets further includes: when it is determined to merge the first target vehicle and the second target vehicle, deleting the first target vehicle or the second target vehicle based on the tracking duration, movement probability, and existence probability of the two target vehicles.
Additionally or alternatively to the above, in the above method, the shape of the plurality of objects aggregated from the point cloud data is a convex polygon or a rectangle.
According to another aspect of the present invention, there is provided a vehicle tracking detection device based on a roadside sensor, the device comprising: a data association module configured to associate a plurality of target vehicles with the point cloud data measured by the roadside sensor, respectively, so as to obtain a plurality of targets aggregated from the point cloud data; and a target merging module configured to determine whether to merge and/or fine-tune two of the plurality of targets by projecting the two targets along directions parallel to their edges.
Additionally or alternatively to the above, in the above device, the data association module is configured to: project a length expansion threshold and a width expansion threshold of a first target vehicle among the plurality of target vehicles into a rectangular coordinate system whose axis is the line from the roadside sensor to the first target vehicle, so as to obtain projection offset values in the rectangular coordinate system; and calculate a radial threshold and an angular threshold in the polar coordinate system of the roadside sensor based on the projection offset values, thereby determining an association gate for data association.
Additionally or alternatively to the above, in the above device, the projection offset values vary with the pose of the first target vehicle.
Additionally or alternatively to the above, in the above device, the data association module is configured to: convert the sum of the projection offset values and an adjustable step size into the radial threshold and the angular threshold in the polar coordinate system.
Additionally or alternatively to the above, in the above device, the data association module is further configured to: setting a maximum length and a maximum width of the plurality of targets.
Additionally or alternatively to the above, in the above device, the target merging module is configured to: determine the projection space between a first target vehicle and a second target vehicle in a plurality of projection directions by projecting along directions parallel to the sides of the two targets; and determine whether to merge and/or fine-tune the first and second target vehicles according to the projection space.
Additionally or alternatively to the above, in the above apparatus, the target merging module is configured to correct the sizes and the center points of the first target vehicle and the second target vehicle when a maximum value of the projection space is within a predefined range.
Additionally or alternatively to the above, in the above apparatus, the target merging module is further configured to: when it is determined to merge the first target vehicle and the second target vehicle, the first target vehicle or the second target vehicle is deleted based on the tracking duration, the movement probability, and the existence probability of the first target vehicle and the second target vehicle.
Additionally or alternatively to the above, in the above device, the plurality of targets aggregated from the point cloud data may be convex polygons or rectangles in shape.
According to yet another aspect of the invention, there is provided a computer storage medium comprising instructions which, when executed, perform a method as described above.
According to a further aspect of the invention there is provided a computer program product comprising a computer program which, when executed by a processor, implements a method as described above.
According to a further aspect of the present invention there is provided a radar tracking system comprising a vehicle tracking detection device as described above.
The roadside-sensor-based vehicle tracking detection scheme of the embodiments of the invention determines whether to merge and/or fine-tune two targets in an actual tracking scene by means of projection (specifically, by projecting along directions parallel to the sides of the two targets, for example the four sides when the targets are rectangular), so that the situation in which two vehicles traveling in parallel are merged into one large object can be reduced or avoided. In addition, the scheme projects the length expansion threshold and the width expansion threshold of the target vehicle into a rectangular coordinate system to obtain projection offset values, and calculates a radial threshold and an angular threshold in the polar coordinate system of the roadside sensor based on those offset values. In this way, the data association thresholds calculated by the projection method change dynamically with the pose of the target vehicle, further improving the accuracy of vehicle tracking.
Drawings
The above and other objects and advantages of the present invention will become more fully apparent from the following detailed description taken in conjunction with the accompanying drawings, in which identical or similar elements are designated by the same reference numerals.
FIG. 1 shows a flow diagram of a roadside sensor-based vehicle tracking detection method, in accordance with one embodiment of the present invention;
FIG. 2 shows a schematic structural diagram of a roadside sensor-based vehicle tracking detection device in accordance with an embodiment of the present invention;
FIG. 3 illustrates a schematic diagram of acquiring projection offset values based on a projection method according to one embodiment of the invention; and
fig. 4 shows a schematic view of a projection space according to an embodiment of the invention.
Detailed Description
Hereinafter, a roadside sensor-based vehicle tracking detection scheme according to various exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Fig. 1 shows a flow diagram of a roadside sensor-based vehicle tracking detection method 1000, according to one embodiment of the invention. As shown in fig. 1, the vehicle tracking detection method 1000 based on the roadside sensor includes the steps of:
in step S110, a plurality of target vehicles are respectively associated with the point cloud data measured by the roadside sensors, so as to obtain a plurality of targets aggregated from the point cloud data; and
in step S120, it is determined whether to merge and/or fine-tune two of the plurality of targets by projecting the targets along directions parallel to the sides of the two targets.
The vehicle infrastructure interconnection system (V2I) is an important component of an automatic driving system: on-board equipment communicates with roadside infrastructure (such as traffic lights, roadside sensors, and roadside units), and the roadside infrastructure can acquire information about vehicles in the nearby area and broadcast various real-time information.
In the context of the present invention, the term "roadside sensor" refers to a sensor mounted at the roadside (e.g., at a road edge, an intersection, a traffic light, or a utility pole) to collect target object information; such sensors may include lidar sensors, millimeter-wave radar sensors, visual sensors, and the like. In one embodiment, the roadside sensors collect object information and transmit it to an Infrastructure Computing Unit (ICU), where the information is fused and filtered.
Tracking a target vehicle, i.e., target tracking, means processing measurement data sent from a roadside sensor (e.g., a radar sensor) to obtain state information about the target vehicle's motion. In target tracking systems, data association is a critical step because of the uncertainty of the observed data obtained by the sensors and the complexity of the multi-target tracking environment. A real sensor system inevitably has measurement errors and lacks prior knowledge of the multi-target tracking environment: the number of targets to track is often unknown in advance, and whether an observation comes from a real target or a false target cannot be known a priori. This makes it difficult to determine the correspondence between observations and real targets. Data association is therefore the process of associating uncertain observations with tracks.
Data association pairs the measurement data Zi (i = 1, 2, ..., n) from one or more sensors with j known or already established tracks. In short, all measurement data are partitioned into j sets such that, with probability close to 1, the measurements in each set come from the same target.
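As a sketch of this partitioning step, the following illustrative Python assigns each measurement to the closest predicted track position if it falls inside a gate, and leaves the rest unassigned. The function names, the Euclidean gating rule, and the nearest-neighbour choice are assumptions made for illustration, not the patent's exact algorithm:

```python
import math

def associate(measurements, tracks, gate_radius):
    """Partition 2-D measurements among known tracks using a simple
    nearest-neighbour distance gate (illustrative sketch only)."""
    clusters = {tid: [] for tid in tracks}
    unassigned = []
    for z in measurements:
        # Find the closest predicted track position.
        best_tid, best_d = None, float("inf")
        for tid, (tx, ty) in tracks.items():
            d = math.hypot(z[0] - tx, z[1] - ty)
            if d < best_d:
                best_tid, best_d = tid, d
        # Accept the pairing only if the measurement falls inside the gate.
        if best_tid is not None and best_d <= gate_radius:
            clusters[best_tid].append(z)
        else:
            unassigned.append(z)
    return clusters, unassigned
```

In a full tracker the gate would be the sector-shaped association gate described below rather than a circular one; the circular gate keeps the sketch minimal.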
In one embodiment, step S110 includes: projecting a length expansion threshold value and a width expansion threshold value of a first target vehicle in the plurality of target vehicles into a rectangular coordinate system taking a connecting line of the road side sensor to the first target vehicle as an axis so as to acquire a projection offset value in the rectangular coordinate system; and calculating a radial threshold value and an angle threshold value under the polar coordinate system of the road side sensor based on the projection offset value in the rectangular coordinate system, so as to determine an association gate for data association.
As the first step of data association, the association gate is a precondition for ensuring that measurements are correctly associated with target state estimates. For multi-target systems in clutter environments, too small an association gate may cause the actual measurement of a target to fall outside the gate, resulting in loss of the target. On the other hand, too large an association gate may admit too many irrelevant measurements, which not only increases the computational complexity of the joint probabilistic data association algorithm but also degrades tracking accuracy. Therefore, to minimize the interference of irrelevant measurements and increase the probability of correct association, the association gate must be properly controlled according to the association situation.
In the embodiment of the invention, when data association is performed, expansion thresholds, namely a length expansion threshold value and a width expansion threshold value, are respectively set for the length and the width of the target vehicle. The threshold values are then projected onto the x and y axes of the rectangular coordinate system, respectively, to obtain projection offset values. Then, a radial threshold value and an angle threshold value under a polar coordinate system of the road side sensor are calculated based on the projection offset value in the rectangular coordinate system, thereby determining an association gate for data association.
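The projection of the expansion thresholds onto the coordinate axes can be sketched as follows. The decomposition by the vehicle's heading angle relative to the line-of-sight axis is an assumption made for illustration, since the patent does not give explicit formulas here, and all names are hypothetical:

```python
import math

def projection_offsets(len_thresh, wid_thresh, heading):
    """Project a target vehicle's length/width expansion thresholds onto
    the x axis (the sensor-to-target line) and the y axis of the
    rectangular coordinate system. `heading` is the vehicle's pose angle
    relative to the line-of-sight axis, in radians. Illustrative sketch."""
    c, s = abs(math.cos(heading)), abs(math.sin(heading))
    off_x = len_thresh * c + wid_thresh * s   # offset along the line of sight
    off_y = len_thresh * s + wid_thresh * c   # offset across the line of sight
    return off_x, off_y
```

Note how the offsets swap roles as the heading rotates: a vehicle driving along the line of sight contributes its length threshold radially, while one driving across it contributes its width threshold radially, which is exactly why the gate varies with pose.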
In one embodiment, the projection offset values vary with the pose of the first target vehicle. It can be understood that, unlike the prior art, which sets the radial and angular thresholds as constants, the data association thresholds calculated by the projection method change dynamically with the pose of the target vehicle, which further improves the quality of the association and hence the accuracy of vehicle tracking.
Referring to fig. 3, a schematic diagram of acquiring a projection offset value based on a projection method according to an embodiment of the present invention is shown. As shown in fig. 3, first, length expansion threshold values 332, 336 and width expansion threshold values 334, 338 are set for the target vehicle 320, and then these threshold values are projected onto the x and y axes of the rectangular coordinate system, respectively, to obtain projection offset values 341, 343, 345, 347. It can be seen that the projection offset value represents the projection deviation of the extended target object 325 from the original target object 320 in the rectangular coordinate system. In the example of fig. 3, the x-axis of the rectangular coordinate system is coaxial with the line connecting the center of the target vehicle 320 and the roadside sensor 310.
In one embodiment, calculating the radial threshold and the angular threshold in the polar coordinate system of the roadside sensor based on the projected offset values in the rectangular coordinate system includes: and converting the sum of the projection offset value and the adjustable step length into the radial threshold value and the angle threshold value under the polar coordinate system. With continued reference to fig. 3, an adjustable step size may be added to the projection offset value, as shown at 342, 344, 346, and 348, before the radial threshold and the angular threshold are calculated/converted. The adjustable step size can be set accordingly according to actual needs.
As shown in fig. 3, in the polar coordinate system of the roadside sensor, the angle threshold θ and the radial threshold (R2 - R1) may be obtained by coordinate transformation of the sum of the projection offset values and the adjustable step size in the rectangular coordinate system.
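That conversion can be sketched as below. The small-gate approximation (treating the cross-range offset at range R as an angle via `atan2`) and all names are illustrative assumptions, not the patent's exact transformation:

```python
import math

def polar_thresholds(off_x, off_y, step, target_range):
    """Convert the projection offsets plus an adjustable step size into a
    radial threshold and an angular threshold in the sensor's polar
    coordinate system (small-gate approximation; illustrative only)."""
    radial = off_x + step                              # R2 - R1 along the line of sight
    angular = math.atan2(off_y + step, target_range)   # half-angle of the sector gate
    return radial, angular
```

Because the angular threshold shrinks with `target_range`, a distant vehicle gets a narrow sector of the same physical width, which matches the intent of a distance-independent association gate.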
In one embodiment, step S110 further includes: setting a maximum length and a maximum width for the plurality of targets. For example, by capping the maximum length and maximum width of a target after point cloud aggregation, the probability of correct association can be ensured even when vehicles travel in parallel and are closely spaced, reducing the interference of irrelevant measurements.
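The maximum-size cap above amounts to a simple clamp. A minimal sketch follows; the specific limits (12 m by 3 m, roughly a truck footprint) are assumed values, not taken from the patent:

```python
def clamp_target_size(length, width, max_length=12.0, max_width=3.0):
    """Cap an aggregated target's footprint so that point cloud
    aggregation cannot grow a track into an implausibly large object.
    The default limits are illustrative assumptions."""
    return min(length, max_length), min(width, max_width)
```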
In one embodiment, step S120 includes: determining the projection space between a first target vehicle and a second target vehicle in a plurality of projection directions by projecting along directions parallel to the sides of the two targets; and determining whether to merge and/or fine-tune the first and second target vehicles according to the projection space.
In one or more embodiments, the shape of the plurality of objects aggregated from the point cloud data is a convex polygon or rectangle. For example, when the shape of the object is rectangular, step S120 includes: the projection space between the first target vehicle and the second target vehicle in the four projection directions is determined by projecting in the directions in which the four sides of the two targets are parallel. For another example, when the shape of the object is a convex polygon, then the projection space between the first object vehicle and the second object vehicle in the plurality of projection directions is determined by projecting toward directions in which the plurality of sides (for example, more than four sides) of the two objects are parallel.
Fig. 4 shows a schematic view of a projection space according to an embodiment of the invention. As shown in fig. 4, when the first target vehicle 410 and the adjacent second target vehicle 420 do not overlap (their IoU is zero), there is always at least one direction, among the directions parallel to the edges of the two targets, along which the projections of the two objects have a margin rather than overlapping each other. In fig. 4, the projections of the first target vehicle 410 and the second target vehicle 420 along projection direction 440 overlap, as shown at 445, but there is a gap 435, i.e., a projection space, between their projections along projection direction 430. Thus, in the context of the present invention, the projection space can be used to reflect the mutual spacing of target objects in a projection direction.
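The projection space can be computed in the style of a separating-axis test. The following sketch (the vertex-list representation and function names are assumptions) returns the largest gap between the two targets' projections over all edge-parallel directions, or 0.0 if the projections overlap in every direction:

```python
import math

def _interval(points, axis):
    """Project a polygon's vertices onto a unit axis; return (min, max)."""
    dots = [px * axis[0] + py * axis[1] for px, py in points]
    return min(dots), max(dots)

def projection_space(poly_a, poly_b):
    """Maximum gap between the projections of two convex polygons over
    all directions parallel to their edges (polygons are vertex lists in
    order). Illustrative separating-axis-style sketch of the patent's idea."""
    axes = []
    for poly in (poly_a, poly_b):
        n = len(poly)
        for i in range(n):
            ex = poly[(i + 1) % n][0] - poly[i][0]
            ey = poly[(i + 1) % n][1] - poly[i][1]
            norm = math.hypot(ex, ey)
            if norm > 0:
                axes.append((ex / norm, ey / norm))
    best_gap = 0.0
    for axis in axes:
        a_lo, a_hi = _interval(poly_a, axis)
        b_lo, b_hi = _interval(poly_b, axis)
        gap = max(a_lo - b_hi, b_lo - a_hi)  # > 0 means a margin on this axis
        best_gap = max(best_gap, gap)
    return best_gap
```

For rectangles, the set of edge-parallel directions coincides with the set of edge normals of the perpendicular sides, so projecting along edge directions is equivalent to the classical separating-axis test.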
In one embodiment, determining whether to merge and/or fine-tune the first and second target vehicles based on the projection space comprises: when the maximum value of the projection space is smaller than a first threshold, deciding to merge the first and second target vehicles; when the maximum value is greater than the first threshold but less than a second threshold, deciding to fine-tune the sizes and center points of the first and second target vehicles; and when the maximum value is greater than the second threshold, deciding not to merge the first and second target vehicles. It will be understood that when two vehicles approach each other, the estimated size and center of each target may be affected by the other's proximity. Thus, when the projection space falls within a predefined range (e.g., greater than the first threshold but less than the second threshold), the sizes and center points of the compared targets need to be corrected.
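The three-way decision can be sketched as follows; the threshold values are illustrative placeholders, since the patent does not specify them:

```python
def decide(gap, t1=0.3, t2=1.0):
    """Three-way decision on the maximum projection space `gap` between
    two targets. The thresholds t1, t2 (meters) are assumed values."""
    if gap < t1:
        return "merge"         # treat the two targets as one object
    if gap < t2:
        return "fine-tune"     # correct sizes and center points
    return "keep-separate"     # leave both targets unchanged
```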
In one embodiment, step S120 further includes: when it is determined to merge the first target vehicle and the second target vehicle, deleting the first target vehicle or the second target vehicle based on the tracking duration, movement probability, and existence probability of the two target vehicles. For example, other conditions being equal, the target with the longer tracking duration carries a higher weight and is preferentially retained.
In addition, those skilled in the art will readily understand that the roadside-sensor-based vehicle tracking detection method provided in one or more embodiments of the present invention may be implemented by a computer program. For example, the computer program may be embodied in a computer program product that, when executed by a processor, implements the vehicle tracking detection method of one or more embodiments of the present invention. For another example, when a computer storage medium (e.g., a USB disk) storing the computer program is connected to a computer, the computer program may be executed to perform the roadside-sensor-based vehicle tracking detection method of one or more embodiments of the present invention.
Referring to fig. 2, fig. 2 shows a schematic configuration of a roadside-sensor-based vehicle tracking detection device 2000 according to an embodiment of the present invention. As shown in fig. 2, the vehicle tracking detection device 2000 includes a data association module 210 and a target merging module 220. The data association module 210 is configured to associate a plurality of target vehicles with the point cloud data measured by the roadside sensor, respectively, so as to obtain a plurality of targets aggregated from the point cloud data; the target merging module 220 is configured to determine whether to merge and/or fine-tune two of the plurality of targets by projecting the targets along directions parallel to the sides of the two targets.
The vehicle infrastructure interconnection system (V2I) is an important component in an automatic driving system, and refers to that vehicle-mounted equipment communicates with road side infrastructure (such as traffic lights, road side sensors, road side units and the like), and the road side infrastructure can acquire information of vehicles in a nearby area and issue various real-time information.
In the context of the present invention, the term "roadside sensor" refers to a sensor for mounting at a roadside (e.g., at a road edge, an intersection, a traffic light, a utility pole, etc.), for collecting target object information, which may include laser radar sensors, millimeter wave radar sensors, visual sensors, and the like. In one embodiment, the roadside sensors collect object information and transmit to an Infrastructure Computing Unit (ICU), where the ICU fuses and filters the information.
The tracking of the target vehicle, that is, target tracking, is to process measurement data sent from a road side sensor (for example, a radar sensor) to obtain state information of the movement of the target vehicle. In target tracking systems, data correlation is a critical step due to the uncertainty of the observed data obtained by the sensors and the complexity of the multi-target tracking environment. The actual sensor system always inevitably has measurement errors and lacks the priori knowledge of the multi-target tracking environment, the tracking number of the targets is often not known a priori, and whether the observed data is from a real target or a false target cannot be known a priori. This makes it difficult to determine the correspondence between the observed data and the real target. Thus, data association (data association) is the process of associating uncertainty observations with tracks.
Data association is the process of pairing the measurement data Zi (i = 1, 2, ..., n) from one or more sensors with j known or already-established tracks. In short, all measurements are partitioned into j sets, such that with probability close to 1 the measurements in each set come from the same target.
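The set-partitioning view above can be illustrated by a minimal gated nearest-neighbour assignment. This is only a sketch of the general data association idea, not the patent's algorithm; the function name, the Euclidean distance metric, and the scalar gate are assumptions for illustration:

```python
import math

def associate(measurements, tracks, gate):
    """Assign each measurement to the nearest track within the gate.

    measurements: list of (x, y) points Z_i
    tracks: list of (x, y) predicted track positions (j tracks)
    gate: maximum allowed distance for a valid association
    Returns a list of j lists of measurements, one per track.
    """
    sets = [[] for _ in tracks]
    for z in measurements:
        dists = [math.dist(z, t) for t in tracks]
        j = min(range(len(tracks)), key=lambda k: dists[k])
        if dists[j] <= gate:  # outside every gate: likely clutter or a false target
            sets[j].append(z)
    return sets
```

Measurements that fall outside every gate are discarded as clutter, which is exactly the role the association gate plays in the embodiments below.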
In one embodiment, the data association module 210 is configured to: projecting a length expansion threshold value and a width expansion threshold value of a first target vehicle in the plurality of target vehicles into a rectangular coordinate system taking a connecting line of the road side sensor to the first target vehicle as an axis so as to acquire a projection offset value in the rectangular coordinate system; and calculating a radial threshold value and an angle threshold value under the polar coordinate system of the road side sensor based on the projection offset value in the rectangular coordinate system, so as to determine an association gate for data association.
As the first step of data association, the association gate is a precondition for ensuring that measurements are correctly associated with the target state estimate. For multi-target systems in clutter, too small an association gate may cause the target's actual measurement to fall outside the gate, resulting in loss of the target. On the other hand, too large an association gate brings in many irrelevant measurements, which not only increases the computational complexity of the joint probabilistic data association algorithm but also degrades tracking accuracy. Therefore, to minimize interference from irrelevant measurements and increase the probability of correct association, the association gate must be controlled appropriately according to the association situation.
In the embodiment of the invention, when data association is performed, expansion thresholds, namely a length expansion threshold value and a width expansion threshold value, are set for the length and the width of the target vehicle, respectively. These threshold values are then projected onto the x and y axes of the rectangular coordinate system to obtain projection offset values. A radial threshold value and an angle threshold value in the polar coordinate system of the roadside sensor are then calculated from the projection offset values, thereby determining the association gate for data association (represented as an annular sector whose two arcs are determined by the distance R from the sensor and the included angle θ).
In one embodiment, the projection offset value varies with the pose of the first target vehicle. It can be understood that, unlike the prior art in which the radial threshold value and the angle threshold value are set as constants, the threshold values for data association calculated by the projection method change dynamically with the pose of the target vehicle, which further improves the quality of the association and thus the accuracy of vehicle tracking.
Referring to fig. 3, a schematic diagram of acquiring a projection offset value based on a projection method according to an embodiment of the present invention is shown. As shown in fig. 3, first, length expansion threshold values 332, 336 and width expansion threshold values 334, 338 are set for the target vehicle 320, and then these threshold values are projected onto the x and y axes of the rectangular coordinate system, respectively, to obtain projection offset values 341, 343, 345, 347. It can be seen that the projection offset value represents the projection deviation of the extended target object 325 from the original target object 320 in the rectangular coordinate system. In the example of fig. 3, the x-axis of the rectangular coordinate system is coaxial with the line connecting the center of the target vehicle 320 and the roadside sensor 310.
In one embodiment, the data association module 210 is configured to: convert the sum of the projection offset value and an adjustable step size into the radial threshold value and the angle threshold value in the polar coordinate system. With continued reference to fig. 3, an adjustable step size may be added to the projection offset value, as shown at 342, 344, 346, and 348, before the radial threshold and the angle threshold are calculated/converted. The adjustable step size can be set according to actual needs.
As shown in fig. 3, in the polar coordinate system of the roadside sensor, the angle threshold value θ and the radial threshold value (R2 - R1) may be obtained by coordinate transformation of the sum of the projection offset value and the adjustable step size in the rectangular coordinate system.
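The conversion from expansion thresholds to polar gate thresholds can be sketched as follows. This is a minimal illustration under stated assumptions: the rectangular frame's x-axis lies along the sensor-to-vehicle line (as in fig. 3), the projection offsets are obtained from the vehicle heading by simple trigonometry, and the function and parameter names are hypothetical:

```python
import math

def association_gate(d, heading, dl, dw, step=0.0):
    """Convert expansion thresholds into polar association-gate thresholds.

    d:       sensor-to-vehicle distance (x-axis = sensor-to-vehicle line)
    heading: vehicle yaw relative to that x-axis (pose-dependent)
    dl, dw:  length and width expansion threshold values
    step:    adjustable step added to the projection offsets
    Returns (radial_threshold, angle_threshold) = (R2 - R1, theta).
    """
    c, s = abs(math.cos(heading)), abs(math.sin(heading))
    # Projection offsets of the expansion thresholds on the x and y axes.
    off_x = dl * c + dw * s  # along the sensor-to-vehicle line
    off_y = dl * s + dw * c  # perpendicular to it
    # Add the adjustable step, then convert to polar coordinates.
    r1 = d - (off_x + step) / 2
    r2 = d + (off_x + step) / 2
    theta = 2 * math.atan2((off_y + step) / 2, d)
    return r2 - r1, theta
```

Note that `off_x` and `off_y`, and hence the resulting gate, change with `heading`: this is the pose-dependent behavior described above, in contrast to constant thresholds.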
In one embodiment, the data association module 210 may be further configured to: set a maximum length and a maximum width for the plurality of targets. For example, by capping the length and width of a target vehicle after point cloud aggregation, the probability of correct association can be maintained even when vehicles travel in parallel and closely spaced, reducing interference from irrelevant measurements.
In one embodiment, the target merge module 220 is configured to: determining a projection space between a first target vehicle and a second target vehicle in a plurality of projection directions by projecting toward directions in which a plurality of sides of two of the plurality of targets are parallel; and determining whether to merge and/or fine tune the first and second target vehicles according to the projection space.
In one or more embodiments, the shape of the plurality of objects aggregated from the point cloud data is a convex polygon or rectangle. For example, when the shape of the object is rectangular, the object merging module 220 is configured to determine a projection space between the first object vehicle and the second object vehicle in four projection directions by projecting toward directions in which four sides of the two objects are parallel. For another example, when the shape of the object is a convex polygon, then the object merging module 220 is configured to determine a projection space between the first object vehicle and the second object vehicle in a plurality of projection directions by projecting toward directions in which a plurality of sides (e.g., more than four sides) of the two objects are parallel.
Fig. 4 shows a schematic view of a projection space according to an embodiment of the invention. As shown in fig. 4, when the first target vehicle 410 and the adjacent second target vehicle 420 do not overlap (their IoU is zero), there is always at least one direction, among the directions parallel to the edges of the two targets, along which the projections of the two objects leave a margin rather than overlapping. As shown in fig. 4, the projections of the first target vehicle 410 and the second target vehicle 420 overlap along projection direction 440 (shown as 445), but leave a space 435, i.e., the projection space, along projection direction 430. Thus, in the context of the present invention, the projection space reflects the mutual spacing of the target objects in a projection direction.
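Computing the projection space over all edge-parallel directions is closely related to the separating-axis test for convex shapes. The sketch below, with assumed names, projects the vertices of two convex polygons onto each edge direction and reports the largest gap between the projected intervals; a negative result means the projections overlap on every tested axis:

```python
import math

def projection_gap(poly_a, poly_b):
    """Largest gap between two convex polygons over all edge-parallel axes.

    poly_a, poly_b: lists of (x, y) vertices in order.
    Returns the maximum projection space found; a negative value means
    the projections overlap along every tested direction.
    """
    def axes(poly):
        n = len(poly)
        for i in range(n):
            ex = poly[(i + 1) % n][0] - poly[i][0]
            ey = poly[(i + 1) % n][1] - poly[i][1]
            norm = math.hypot(ex, ey)
            yield ex / norm, ey / norm  # unit vector along the edge

    def interval(poly, ax):
        dots = [p[0] * ax[0] + p[1] * ax[1] for p in poly]
        return min(dots), max(dots)

    best = -math.inf
    for ax in list(axes(poly_a)) + list(axes(poly_b)):
        (a0, a1), (b0, b1) = interval(poly_a, ax), interval(poly_b, ax)
        gap = max(b0 - a1, a0 - b1)  # gap between the projected intervals
        best = max(best, gap)
    return best
```

For axis-aligned rectangles (as in fig. 4), the edge directions reduce to the two perpendicular projection directions 430 and 440.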
In one embodiment, the target merge module 220 is configured to: determining to merge the first and second target vehicles when (the maximum value of) the projection space is smaller than a first threshold; determining to fine tune the size and center point of the first and second target vehicles when the (maximum value of the) projection space is greater than the first threshold but less than a second threshold; and when (the maximum value of) the projection space is larger than the second threshold value, deciding not to merge the first target vehicle and the second target vehicle. It is understood that the size and center of the target vehicle may be affected by the proximity of the two vehicles as they approach each other. Thus, when the projection space falls within a predefined range (e.g., greater than the first threshold but less than the second threshold), correction is required to the size and center point of the commonly compared objects.
In one embodiment, the target merge module 220 may be further configured to: when it is determined to merge the first target vehicle and the second target vehicle, delete the first target vehicle or the second target vehicle based on the tracking duration, the movement probability, and the existence probability of the two vehicles. For example, under otherwise equal conditions, the target with the longer tracking duration carries a higher weight and is preferentially retained.
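One way to combine the three criteria is a weighted score, sketched below. The specific weights and the linear combination are assumptions; the patent only states that a longer tracking duration is weighted more heavily:

```python
def retained_target(a, b, w_t=0.5, w_m=0.25, w_e=0.25):
    """Pick which of two merged targets to keep (the other is deleted).

    Each target is a dict with 'track_time' (seconds), 'move_prob',
    and 'exist_prob'. Weights w_t, w_m, w_e are illustrative values.
    """
    def score(t, t_max):
        # Normalize tracking duration so all three terms are in [0, 1].
        return (w_t * t["track_time"] / t_max
                + w_m * t["move_prob"] + w_e * t["exist_prob"])
    t_max = max(a["track_time"], b["track_time"]) or 1.0
    return a if score(a, t_max) >= score(b, t_max) else b
```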
In one or more embodiments, the apparatus 2000 described above may be integrated in a variety of radar tracking systems, as the invention is not limited in this regard.
In summary, the roadside sensor-based vehicle tracking detection scheme according to the embodiments of the present invention determines whether to merge and/or fine-tune two targets by projection (specifically, projection along directions parallel to the sides of the two targets, for example the four sides when the targets are rectangular), thereby reducing or avoiding the situation in which two vehicles traveling in parallel are merged into one large target in the actual tracking scene. In addition, the scheme projects the length expansion threshold value and the width expansion threshold value of the target vehicle into a rectangular coordinate system to obtain projection offset values, and calculates a radial threshold value and an angle threshold value in the polar coordinate system of the roadside sensor based on these offset values. In this way, the threshold values used for data association change dynamically with the pose of the target vehicle, further improving the accuracy of vehicle tracking.
While the above description describes only some of the embodiments of the present invention, those of ordinary skill in the art will appreciate that the present invention can be embodied in many other forms without departing from the spirit or scope thereof. Accordingly, the present examples and embodiments are to be considered as illustrative and not restrictive, and the invention is intended to cover various modifications and substitutions without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (15)

1. A vehicle tracking detection method based on a roadside sensor, the method comprising:
respectively associating a plurality of target vehicles with the point cloud data measured by the road side sensor so as to obtain a plurality of targets aggregated by the point cloud data; and
determining whether to merge and/or fine-tune two targets of the plurality of targets by projecting along directions parallel to the sides of the two targets.
2. The method of claim 1, wherein associating a plurality of target vehicles with measured point cloud data comprises:
projecting a length expansion threshold value and a width expansion threshold value of a first target vehicle in the plurality of target vehicles into a rectangular coordinate system taking a connecting line of the road side sensor to the first target vehicle as an axis so as to acquire a projection offset value in the rectangular coordinate system; and
and calculating a radial threshold value and an angle threshold value under the polar coordinate system of the road side sensor based on the projection offset value in the rectangular coordinate system, so as to determine an association gate for data association.
3. The method of claim 2, wherein the projection offset value varies based on a pose of the first target vehicle.
4. The method of claim 2, wherein calculating radial and angular thresholds in a polar coordinate system of the roadside sensor based on projected offset values in the rectangular coordinate system comprises:
and converting the sum of the projection offset value and the adjustable step length into the radial threshold value and the angle threshold value under the polar coordinate system.
5. The method of claim 1, wherein correlating the plurality of target vehicles and the measured point cloud data, respectively, to obtain a plurality of targets aggregated from the point cloud data further comprises:
setting a maximum length and a maximum width of the plurality of targets.
6. The method of claim 1, wherein determining whether to merge and/or fine tune two of the plurality of targets by projecting toward a direction in which edges of the two targets are parallel comprises:
determining a projection space between a first target vehicle and a second target vehicle in a plurality of projection directions by projecting toward directions in which a plurality of sides of two of the plurality of targets are parallel; and
and determining whether to merge and/or finely adjust the first target vehicle and the second target vehicle according to the projection space.
7. The method of claim 6, wherein the dimensions and center points of the first and second target vehicles are corrected when the maximum value of the projection space is within a predefined range.
8. The method of claim 6, wherein determining whether to merge and/or fine tune two of the plurality of targets by projecting toward a direction in which edges of the two targets are parallel further comprises:
when it is determined to merge the first target vehicle and the second target vehicle, the first target vehicle or the second target vehicle is deleted based on the tracking duration, the movement probability, and the existence probability of the first target vehicle and the second target vehicle.
9. The method of claim 1, wherein the shape of the plurality of objects aggregated from the point cloud data is convex polygon or rectangle.
10. A roadside sensor-based vehicle tracking detection device, the device comprising:
the data association module is used for respectively associating a plurality of target vehicles with the point cloud data measured by the road side sensor so as to obtain a plurality of target objects aggregated by the point cloud data; and
and the target merging module is used for determining whether to merge and/or finely adjust the two targets by projecting the two targets in a mode of parallel directions of a plurality of sides of the two targets.
11. The device of claim 10, wherein the target merge module is configured to:
determining a projection space between a first target vehicle and a second target vehicle in a plurality of projection directions by projecting toward directions in which a plurality of sides of two of the plurality of targets are parallel; and
and determining whether to merge and/or finely adjust the first target vehicle and the second target vehicle according to the projection space.
12. The apparatus of claim 11, wherein the target merge module is configured to correct the size and center point of the first and second target vehicles when the maximum value of the projection space is within a predefined range.
13. A computer storage medium, characterized in that the medium comprises instructions which, when run, perform the method of any one of claims 1 to 9.
14. A computer program product comprising a computer program which, when executed by a processor, implements the method of any one of claims 1 to 9.
15. A radar tracking system, characterized in that the radar tracking system comprises a vehicle tracking detection apparatus as claimed in any one of claims 10 to 12.
CN202111654422.7A 2021-12-31 2021-12-31 Vehicle tracking detection method and device based on road side sensor Pending CN116413734A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111654422.7A CN116413734A (en) 2021-12-31 2021-12-31 Vehicle tracking detection method and device based on road side sensor
PCT/EP2022/087856 WO2023126390A1 (en) 2021-12-31 2022-12-27 Vehicle tracking and detection method and device based on roadside sensor

Publications (1)

Publication Number Publication Date
CN116413734A true CN116413734A (en) 2023-07-11


Also Published As

Publication number Publication date
WO2023126390A1 (en) 2023-07-06

Legal Events

Date Code Title Description
PB01 Publication