EP4336445A2 - Mehrobjektverfolgungssystem und vorrichtung zur optimierung seiner bewegungsbahn und optimierungsverfahren - Google Patents

Mehrobjektverfolgungssystem und vorrichtung zur optimierung seiner bewegungsbahn und optimierungsverfahren Download PDF

Info

Publication number
EP4336445A2
Authority
EP
European Patent Office
Prior art keywords
trajectory
motion
motion trajectory
detection results
optimization
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP23194429.9A
Other languages
English (en)
French (fr)
Inventor
Jiadi ZHANG
Leichen WANG
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of EP4336445A2 publication Critical patent/EP4336445A2/de
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/292Multi-camera tracking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30232Surveillance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30241Trajectory

Definitions

  • the present invention generally relates to the technical field of multi-object tracking, particularly to a multi-object tracking system and its motion trajectory optimization apparatus and optimization method. It also relates to a corresponding machine-readable storage medium.
  • The primary task of Multi-Object Tracking (MOT) technology is to detect multiple target objects in an environment and associate the detected target objects with motion trajectories.
  • Multi-object tracking technology finds wide application in areas such as traffic management, security monitoring, autonomous driving, and robotics.
  • Data processing typically employs data association to correlate data from the same target object and obtain motion trajectories for each target object.
  • Data processing includes online and offline methods. Online data processing often uses a high filtering threshold to eliminate noise data. However, this poses the risk of losing useful information, leading to inaccurate and non-robust target object motion trajectories.
  • Offline data processing involves processing all sensor data without setting any filtering thresholds, which can be computationally complex, inefficient, and unable to yield accurate and robust target object motion trajectories.
  • the present invention provides a motion trajectory optimization solution for a multi-object tracking system, capable of optimizing multiple motion trajectories of multiple target objects obtained from multi-object tracking.
  • a motion trajectory optimization apparatus for a multi-object tracking system.
  • the apparatus includes: an acquisition module configured to acquire target object data from sensors perceiving multiple target objects; a tracking module configured to track the target object data to obtain multiple motion trajectories for the multiple target objects.
  • Each motion trajectory includes multiple detection results of a target object at various instances.
  • Each detection result includes the ID of the target object, detection timestamp, and multiple feature parameters.
  • the multiple feature parameters comprise one or more motion parameters and one or more geometric parameters.
  • an optimization module is configured to perform at least one optimization on the multiple motion trajectories, including:
  • a multi-object tracking system includes: a perception device comprising one or more sensors for perceiving multiple target objects and outputting target object data; and the aforementioned motion trajectory optimization apparatus, communicatively connected to the perception device, for tracking the target object data to obtain multiple motion trajectories for the multiple target objects and optimizing the obtained multiple motion trajectories.
  • a motion trajectory optimization method for a multi-object tracking system is provided.
  • the method is executed by the aforementioned optimization apparatus or the aforementioned multi-object tracking system.
  • the method includes: acquiring target object data containing multiple target objects perceived by sensors; tracking the target object data to obtain multiple motion trajectories for the multiple target objects.
  • Each motion trajectory includes multiple detection results of a target object at various instances.
  • Each detection result includes the ID of the target object, detection timestamp, and multiple feature parameters.
  • the multiple feature parameters comprise one or more motion parameters and one or more geometric parameters.
  • the method involves performing at least one optimization on the multiple motion trajectories, including:
  • a machine-readable storage medium stores executable instructions that, when executed, cause one or more processors to perform the aforementioned motion trajectory optimization method.
  • FIG. 1 schematically illustrates a multi-object tracking system 100 according to an example of the present invention, comprising a perception device 10 and a motion trajectory optimization apparatus 20 (referred to as “optimization apparatus 20" hereinafter).
  • the perception device 10 includes one or more sensors (SENSOR_1, SENSOR_2, SENSOR_3... SENSOR_n) for detecting/perceiving multiple target objects in the environment and outputting target object data (SENSOR DATA).
  • the multiple target objects in the environment can include static target objects (e.g., buildings, plants, obstacles on the road, etc.) as well as dynamic target objects (e.g., moving vehicles, pedestrians or pets in motion, etc.).
  • Embodiments of the present invention involve tracking dynamic multiple target objects, obtaining their motion trajectories, and optimizing the motion trajectories.
  • “dynamic multiple target objects” herein refer to target objects having different positions at different times, i.e., these target objects are in motion.
  • One or more sensors are used to capture the 3D information of the target objects in the environment (e.g., 3D positions, 3D dimensions, velocity, acceleration, angular velocity, and angular acceleration).
  • One or more sensors may include cameras and/or lidar sensors.
  • the optimization apparatus 20 is communicatively connected to the perception device 10 via wired and/or wireless means to obtain target object data from the perception device 10.
  • the optimization apparatus processes the target object data to obtain optimized motion trajectories of the target objects.
  • the optimization apparatus 20 and the perception device 10 can be located and configured according to specific application scenarios. For example, in a case where examples of the present invention are applied in a traffic scenario, multiple target objects may include multiple traffic participants, such as multiple moving vehicles.
  • the perception device 10 can include one or more vehicle-mounted sensors and/or one or more roadside sensors.
  • the optimization apparatus 20 can be located in the vehicle's ECU, roadside units, or cloud servers.
  • as another example, in a factory environment, multiple target objects may include multiple mobile delivery trucks.
  • the perception device 10 can include one or more sensors installed on the robot.
  • the optimization apparatus 20 can be located in the robot's controller, central controller in the factory environment, or edge server.
  • the optimization apparatus 20 may include an acquisition module 21, a tracking module 22, and an optimization module 23.
  • the optimization apparatus 20 and its modules can be implemented using hardware, software, or a combination of both.
  • Hardware implementation can be done using one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, electronic units designed to perform their functions, or combinations thereof.
  • microcode, program code, or code segments can be utilized, stored in machine-readable storage media such as storage components.
  • the division of modules within the optimization apparatus 20 is functional and is not intended to limit their physical locations or implementations.
  • these modules can be located on the same chip or circuit, or they can be distributed across different chips or circuits.
  • the optimization apparatus 20 is implemented to include memory and a processor.
  • the memory stores instructions that, when executed by the processor, implement the trajectory optimization method according to the examples of the present invention.
  • FIG. 2 illustrates a motion trajectory processing process 200 according to one embodiment of the present invention.
  • the process 200 can be implemented by the aforementioned optimization apparatus 20 or the aforementioned system 100, as the descriptions above about the optimization apparatus 20 and system 100 are equally applicable here.
  • acquisition module 21 acquires target object data, which includes multiple target objects sensed by one or more sensors.
  • tracking module 22 tracks the acquired target object data to obtain multiple motion trajectories of multiple target objects (in the subsequent description, motion trajectories may be referred to as "trajectories").
  • tracking the target object data includes associating data belonging to the same target object.
  • Each motion trajectory includes multiple detection results of the target object at multiple time instances. These time instances can include equidistant or non-equidistant time instances.
  • Each detection result comprises: the ID of the target object (i.e., a unique identifier for the target object), detection timestamp (i.e., the moment of detection), and multiple feature parameters of the target object. These feature parameters include one or more motion parameters (such as motion speed and acceleration in the x, y, and z directions) and one or more geometric parameters (such as length, width, and height).
  • Each detection result may also include a perception confidence level indicating the reliability of the sensor's perception performance, which is derived from the sensing capabilities of the sensor. A higher perception confidence level indicates higher credibility of perception for that detection result. This perception confidence level is included in the target object data, and the present invention does not limit how it is obtained.
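  • As a purely illustrative sketch (not part of the original disclosure), a detection result and a motion trajectory with the fields described above could be represented as follows; all class and field names are assumptions:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DetectionResult:
    # Field names are illustrative assumptions, not the patent's terminology.
    object_id: int                 # unique ID of the target object
    timestamp: float               # detection timestamp in seconds
    vx: float = 0.0                # motion parameters: velocity in x, y, z
    vy: float = 0.0
    vz: float = 0.0
    ax: float = 0.0                # motion parameters: acceleration in x, y, z
    ay: float = 0.0
    az: float = 0.0
    length: float = 0.0            # geometric parameters
    width: float = 0.0
    height: float = 0.0
    perception_confidence: float = 1.0   # reported with the sensor data

@dataclass
class MotionTrajectory:
    object_id: int
    detections: List[DetectionResult] = field(default_factory=list)
```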
  • feature parameters including motion and geometric parameters can be represented using a 3D model of the target object.
  • This 3D model may include a kinematic model and a geometric model.
  • the kinematic model may employ nonlinear motion models such as Constant Turn Rate and Velocity (CTRV) or Constant Turn Rate and Acceleration (CTRA), or linear motion models such as Constant Velocity (CV) and Constant Acceleration (CA).
  • the geometric model contains 3D information describing the shape of the target object, like its length, width, and height.
  • This example of the 3D model is particularly suitable for vehicles traveling on roads.
  • the 3D model can be adaptively adjusted, for example, by adding a vertical motion speed Vz or omitting the yaw angle parameter.
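  • As an illustration of these kinematic models (an assumed sketch, not the patent's implementation), a single CV and CTRV prediction step could look like this:

```python
import math

def predict_cv(x, y, vx, vy, dt):
    """Constant Velocity (CV): the position advances linearly over dt."""
    return x + vx * dt, y + vy * dt

def predict_ctrv(x, y, v, yaw, yaw_rate, dt):
    """Constant Turn Rate and Velocity (CTRV): motion along a circular arc."""
    if abs(yaw_rate) < 1e-6:                     # near-zero turn rate: straight-line motion
        return x + v * math.cos(yaw) * dt, y + v * math.sin(yaw) * dt, yaw
    x_new = x + v / yaw_rate * (math.sin(yaw + yaw_rate * dt) - math.sin(yaw))
    y_new = y + v / yaw_rate * (math.cos(yaw) - math.cos(yaw + yaw_rate * dt))
    return x_new, y_new, yaw + yaw_rate * dt
```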
  • optimization module 23 performs at least one optimization on multiple motion trajectories, including:
  • ideally, each trajectory contains detection results with only one target object ID, the detection results on the trajectory are highly similar to one another, the trajectory is continuous, and the geometric parameters in each detection result on the trajectory are the same.
  • in other words, a target object does not disappear abruptly, its size does not change, adjacent time instances have similar motion states, and the trajectories of other target objects are not present in the motion trajectory.
  • extreme cases might occur, such as collision deformation of the target object or sudden removal, where the motion trajectory might not conform to the above rules.
  • the examples of the present invention do not encompass such extreme scenarios.
  • target objects may be temporarily or frequently occluded, which, from the perspective of sensor perception, results in the target object "disappearing" from the target object data and reappearing.
  • this leads to the following issues in tracked motion trajectories: detection results that deviate to varying degrees from the trajectory (i.e., global outlier detection results or local outlier detection results), and segments of other target objects' trajectories being included in the trajectory (i.e., ID-switching trajectory segments).
  • sensors exhibit different detection accuracies for target objects at varying distances, leading to the following issues in tracked motion trajectories: inconsistent geometric parameters in multiple detection results of the same trajectory.
  • the trajectory can be optimized according to the optimization strategy of the present invention.
  • a global outlier detection result refers to a detection result where two or more feature parameters in its multiple feature parameters deviate as "outliers," i.e., they deviate from corresponding two or more feature parameters of other detection results on the trajectory.
  • global outlier detection results refer to those detection results that deviate significantly from the trajectory.
  • a situation of global outlier detection results could be when two or more parameters in the above 3D model are "outliers," or when every parameter in the above 3D model is an "outlier.”
  • a local outlier detection result refers to a detection result where one feature parameter in its multiple feature parameters deviates as an "outlier," i.e., it deviates from a corresponding feature parameter of other detection results on the trajectory.
  • local outlier detection results refer to those detection results that deviate to a lesser extent from the trajectory. For example, a situation of local outlier detection results could be when one parameter in the above 3D model is an "outlier.”
  • the term "outlier" encompasses the notions of isolation, novelty, and deviation. In this invention, these different ways of measuring or naming deviations are collectively referred to as outliers.
  • For clarity, examples of motion trajectories to be optimized are illustrated in FIGS. 3A-3D.
  • FIG. 3A illustrates an example of a motion trajectory (TRAJECTORY_1) containing local outlier detection results.
  • In FIG. 3A, TS denotes Track Second (the tracking time in seconds), L the length of the target object, W the width of the target object, and H the height of the target object.
  • in FIG. 3A, the length, width, and height are the same in the detection results at 0 seconds, 1 second, and 2 seconds, whereas in the detection result at 3 seconds the width differs from the widths in the other detection results. Therefore, the detection result at 3 seconds is a local outlier detection result.
  • FIG. 3B illustrates an example of a motion trajectory (TRAJECTORY_2) containing global outlier detection results.
  • the same symbols as in FIG. 3A represent the same meanings.
  • in FIG. 3B, the length, width, and height are the same in the detection results at 0 seconds, 1 second, and 3 seconds, whereas in the detection result at 2 seconds all three feature parameters differ from those in the other detection results. Therefore, the detection result at 2 seconds is a global outlier detection result.
  • FIG. 3C illustrates an example of a motion trajectory (TRAJECTORY_3) containing inconsistent geometric parameters in the detection results.
  • In TRAJECTORY_3, the same symbols as in FIG. 3A represent the same meanings, and CS denotes the Sensing Confidence Score.
  • the length, width, and height are all different in the detection results at 0 seconds, 1 second, 2 seconds, and 3 seconds. The width and height information are missing in the detection result at 2 seconds.
  • FIG. 3D illustrates an example of a motion trajectory (TRAJECTORY_4) containing ID-switched trajectory segments.
  • the target objects in trajectory segments 1 (TRACKLET_1) and 3 (TRACKLET_3) are both cars (CAR), while the target object in trajectory segment 2 (TRACKLET_2) is a truck (TRUCK). Therefore, this motion trajectory contains trajectory segments with ID-switches, specifically, trajectory segment 2.
  • FIG. 4 illustrates an implementation of trajectory optimization for motion trajectories containing global outlier detection results, namely, an implementation of Box 231.
  • for each motion trajectory, the multiple feature parameters (both motion and geometric parameters) of each detection result are collectively taken as a sample.
  • the distance between the multiple parameters in this detection result and those in other detection results is calculated as a sample-to-sample distance, for instance, using Mahalanobis distance or Euclidean distance, to obtain inter-sample distances.
  • deviations of the overall motion and geometric parameters are computed, such as deviations in the length, width, height, lateral and longitudinal velocity of the target object.
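  • A minimal sketch of how such inter-sample distances might be computed (taking each detection result's feature vector as one sample; the function name and regularization are assumptions):

```python
import numpy as np

def inter_sample_distances(samples, metric="euclidean"):
    """samples: (N, D) array, one row per detection result on the trajectory.
    Returns an (N, N) matrix of pairwise sample-to-sample distances."""
    samples = np.asarray(samples, dtype=float)
    diff = samples[:, None, :] - samples[None, :, :]           # (N, N, D)
    if metric == "euclidean":
        return np.linalg.norm(diff, axis=-1)
    if metric == "mahalanobis":
        # feature covariance over the trajectory, regularized for invertibility
        cov = np.cov(samples, rowvar=False) + 1e-6 * np.eye(samples.shape[1])
        inv_cov = np.linalg.inv(cov)
        return np.sqrt(np.einsum("ijk,kl,ijl->ij", diff, inv_cov, diff))
    raise ValueError(f"unknown metric: {metric}")
```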
  • the self-supervised global outlier detection network model is trained based on the calculated inter-sample distances to obtain global implicit features.
  • These global implicit features represent the degree of similarity between the multiple feature parameters of a detection result in a motion trajectory and those of other detection results, on an overall level. This degree of similarity can be represented in various forms, such as binary values like 1 or 0, metric values like Euclidean distance, or probabilistic values like percentages.
  • Using a self-supervised learning model has advantages, such as not requiring labeled sample data for training. By mining the intrinsic features of the data, relationships (such as similarity) between samples are discovered. This reduces the extensive work of manual labeling and avoids issues of incorrect ground truth labels that may arise during the manual labeling process.
  • the obtained global implicit features and all detection results of the trajectory are input to the global outlier detection classifier (referred to as the "global classifier") to output classification results.
  • the global classifier categorizes all detection results into two classes: global outlier detection results and non-global outlier detection results.
  • that is, each detection result is classified as either a global outlier detection result or a non-global outlier detection result.
  • the output of the global classifier includes two classes: positive and negative. Positive represents potential global outlier detection results, while negative represents non-global outlier detection results. Conversely, using negative to represent potential global outlier detection results and positive to represent non-global outlier detection results is also feasible.
  • the global classifier can be a trained Supervised Learning Model with the ability to classify input detection results into global or non-global outlier detection results based on global implicit features.
  • each detection result's multiple feature parameters are taken as a sample feature x(t), and nonlinear encoding is applied to obtain encoded feature z(t).
  • x(t) is considered a positive sample feature.
  • autoregressive processing is performed on the feature z(t) (for example, predicting the current feature from the previous moment's feature and iterating in this manner) to obtain the trajectory feature c(t).
  • trajectory feature c(t) can be understood as a high-dimensional feature describing the trajectory in the time domain.
  • a noise function f(n) is added to the positive sample feature x(t) to compute x(t) + f(n) and obtain the negative sample feature y(t).
  • Calculating the negative sample feature can be understood as adding a disturbance f(n) to the positive sample feature x(t) to obtain the negative sample feature y(t).
  • the self-supervised global outlier detection network model is trained using the positive sample feature x(t), the negative sample feature y(t), and the trajectory feature c(t) to obtain the degree of similarity between x(t) and c(t).
  • Training the self-supervised global outlier detection network model can include computing the distance between the positive sample feature x(t) and the trajectory feature c(t) and the distance between the negative sample feature y(t) and the trajectory feature c(t).
  • training the self-supervised global outlier detection network model can involve optimizing the loss function to minimize its value.
  • the loss function includes the calculated distances between the positive sample feature x(t) and the trajectory feature c(t) and between the negative sample feature y(t) and the trajectory feature c(t).
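  • The following PyTorch sketch illustrates one possible form of such a self-supervised training loop (a contrastive loss over positive features x(t), noise-perturbed negative features y(t), and trajectory features c(t)); the network sizes, the noise function f(n), and the exact loss are assumptions, not the patent's specific model:

```python
import torch
import torch.nn as nn

class OutlierDetectionNet(nn.Module):
    """Nonlinear encoder followed by an autoregressive (GRU) context model."""
    def __init__(self, feat_dim, hidden_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(feat_dim, hidden_dim), nn.ReLU(),
                                     nn.Linear(hidden_dim, hidden_dim))
        self.ar = nn.GRU(hidden_dim, hidden_dim, batch_first=True)

    def forward(self, x):                  # x: (batch, T, feat_dim)
        z = self.encoder(x)                # encoded features z(t)
        c, _ = self.ar(z)                  # trajectory features c(t)
        return z, c

def contrastive_loss(x, model, noise_scale=0.5):
    """Pull x(t) toward c(t); push the perturbed y(t) = x(t) + f(n) away."""
    z_pos, c = model(x)
    y = x + noise_scale * torch.randn_like(x)        # assumed Gaussian noise f(n)
    z_neg, _ = model(y)
    pos_sim = torch.cosine_similarity(z_pos, c, dim=-1)
    neg_sim = torch.cosine_similarity(z_neg, c, dim=-1)
    return torch.clamp(1.0 - pos_sim + neg_sim, min=0.0).mean()   # margin-style loss

# usage sketch: one trajectory with 20 detection results and 9 feature parameters
model = OutlierDetectionNet(feat_dim=9)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(1, 20, 9)                            # dummy trajectory features
for _ in range(100):
    optimizer.zero_grad()
    loss = contrastive_loss(x, model)
    loss.backward()
    optimizer.step()
```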
  • the obtained degree of similarity is compared to a predetermined global degree of similarity threshold. If the obtained degree of similarity is less than the global degree of similarity threshold, the detection result is determined as a global outlier detection result. If the obtained degree of similarity is greater than or equal to the global degree of similarity threshold, the detection result is determined as a non-global outlier detection result.
  • the multiple parameters of the determined global outlier detection results are all set to default values (e.g., all parameters in the multiple parameters are set to zero). Then, a smoothing algorithm is used to predict the smoothing values of these multiple parameters. That is, the predicted reasonable values of these multiple parameters at the detection time of the global outlier detection results are generated using the smoothing algorithm.
  • the smoothing algorithm encompasses various methods, such as fixed-point smoothing, fixed-lag smoothing, or fixed-interval smoothing.
  • the smoothing algorithm can also employ linear interpolation, where the current time's interpolated mean is calculated based on the values of the preceding and succeeding time instances.
  • smoothing values can be understood as parameter values that are closer to the trajectory compared to the "deviated” parameters. For example, when velocity parameters and acceleration parameters are deviated parameters in the global outlier detection results, the smoothing value for the deviated velocity parameter is closer to the velocity parameter values in other detection results within the trajectory; similarly, for the deviated acceleration parameter, the smoothing value is closer to the acceleration parameter values in other detection results within the trajectory.
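  • As one concrete example of such smoothing (an assumed choice, applied to each deviated parameter in turn), linear interpolation between the neighboring detection results could look like this:

```python
def interpolate_outlier(values, times, outlier_idx):
    """Predict a smoothing value for the parameter at an outlier detection result
    from the preceding and succeeding detection results (time-weighted linear
    interpolation); endpoints fall back to the nearest valid neighbor."""
    i = outlier_idx
    if i == 0:
        return values[1]
    if i == len(values) - 1:
        return values[-2]
    t_prev, t_cur, t_next = times[i - 1], times[i], times[i + 1]
    w = (t_cur - t_prev) / (t_next - t_prev)
    return (1.0 - w) * values[i - 1] + w * values[i + 1]

# example: a width sequence whose value at 2 seconds was flagged as deviated
widths = [1.8, 1.8, 5.0, 1.8]
times = [0.0, 1.0, 2.0, 3.0]
widths[2] = interpolate_outlier(widths, times, 2)     # -> 1.8
```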
  • the present disclosure provides an embodiment of trajectory optimization for motion trajectories containing local outlier detection results, as illustrated in FIG. 5 , which represents an implementation of box 232.
  • each feature parameter from the multiple feature parameters in each detection result (i.e., each individual motion or geometric parameter) is taken as a sample.
  • the distance between the corresponding feature parameter in each detection result and the same feature parameter in other detection results is calculated, using methods such as Mahalanobis distance or Euclidean distance, to obtain inter-sample distances.
  • the deviation of each parameter from motion and geometric parameters is computed individually. For example, the separate deviations of the length, height, or width of the target object are calculated.
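  • A short sketch of such per-parameter deviation computation (each feature parameter compared individually against the same parameter in the other detection results; names are illustrative):

```python
import numpy as np

def per_parameter_deviations(samples):
    """samples: (N, D) array, one row per detection result.
    Returns an (N, D) array: for every detection result and every individual
    feature parameter, the mean absolute distance to that parameter's values
    in the other detection results of the same trajectory."""
    samples = np.asarray(samples, dtype=float)
    n = samples.shape[0]
    diff = np.abs(samples[:, None, :] - samples[None, :, :])    # (N, N, D)
    return diff.sum(axis=1) / (n - 1)      # self-distance is zero, so divide by N-1
```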
  • the self-supervised local outlier detection network model is trained based on the computed distances to obtain local implicit features. These features represent the degree of similarity between a feature parameter in a detection result and the corresponding feature parameter in other detection results on the same trajectory. It's worth noting that this degree of similarity is based on the degree of similarity of individual feature parameters, whereas the degree of similarity in box 2312 is based on multiple feature parameters.
  • the use of a Self-supervised Learning Module is advantageous, eliminating the need to label training samples and instead discovering relationships (e.g., similarity) between samples by mining their inherent features. This approach saves significant labeling efforts and avoids errors introduced by incorrect Ground Truth labels during the manual labeling process.
  • the obtained local implicit features are used to identify local outlier detection results on the motion trajectory, along with the outlier feature parameter responsible for the deviation. For example, within multiple parameters containing motion and geometric features, the parameter causing the deviation, such as length, is determined.
  • the obtained local implicit features and all detection results of the trajectory are fed into a local outlier detection classifier (referred to as "local classifier") to obtain classification results.
  • each detection result is classified as either local or non-local outlier detection result.
  • the output of the local classifier includes two classes: positive and negative, where positive indicates potential local outlier detection results and negative indicates non-local outlier detection results.
  • using negative to represent potential local outlier detection results and positive for non-local outlier detection results is also valid.
  • the local classifier can be a trained Supervised Learning Module, capable of classifying input detection results into local or non-local outlier detection results based on local implicit features.
  • when the output classifies a detection result as a local outlier detection result, the outlier feature parameter can also be identified.
  • boxes 2321-2323 can be realized through the following steps. First, each feature parameter of every detection result is treated as a sample feature x(t) and non-linearly encoded to obtain the encoded feature z(t), with x(t) considered a positive sample feature. Next, autoregressive processing is applied to z(t), iteratively predicting the current feature from the previous moment's feature, to obtain the trajectory feature c(t). c(t) can be understood as a high-dimensional temporal feature describing the trajectory.
  • a noise function f(n) is added to positive sample feature x(t), i.e., computing x(t) + f(n), to obtain negative sample feature y(t).
  • the computation of negative sample feature can be understood as perturbing positive sample feature x(t) with disturbance f(n) to obtain negative sample feature y(t).
  • training of the self-supervised local outlier detection network model is carried out using positive sample feature x(t), negative sample feature y(t), and trajectory feature c(t) to obtain the degree of similarity between x(t) and c(t).
  • Training of the self-supervised local outlier detection network model can involve, for example, computing the distances between positive sample feature x(t) and trajectory feature c(t) as well as between negative sample feature y(t) and trajectory feature c(t).
  • training of the self-supervised local outlier detection network model can involve optimizing a loss function to minimize its value, where the loss function includes computed distances between positive sample feature x(t) and trajectory feature c(t) as well as between negative sample feature y(t) and trajectory feature c(t).
  • the obtained degree of similarity is compared with a predetermined local degree of similarity threshold. If the obtained degree of similarity is less than the local degree of similarity threshold, the detection result is determined as a local outlier detection result; if the obtained degree of similarity is greater than or equal to the local degree of similarity threshold, the detection result is determined as a non-local outlier detection result.
  • the predicted smooth value is employed to replace one outlier feature parameter from the local outlier detection results.
  • the deviating parameter from the determined local outlier detection results is set to a default value (e.g., set to zero). Then, a smoothing algorithm is utilized to predict the smooth value of that parameter. In other words, the reasonable value of that parameter at the detection moment of the local outlier detection results is predicted.
  • the smoothing algorithm encompasses various approaches, such as fixed-point smoothing, fixed-lag smoothing, or fixed-interval smoothing.
  • the smoothing algorithm can also incorporate linear interpolation, which computes the current moment's interpolated mean value based on the values of the previous and subsequent moments.
  • FIG. 6 illustrates an exemplary application of the aforementioned optimization process 232.
  • the curve represents the change of the target object's position in the X-direction over time
  • point P represents the determined local outlier point (i.e., the outlier point corresponding to the local outlier detection result) whose deviation is caused by the deviation of the position parameter in the X-direction.
  • the smooth value at the moment of P is predicted using the smoothing algorithm, resulting in point P'.
  • point P is removed, and point P' is integrated into the curve, completing the trajectory optimization for the motion trajectory containing local outlier points.
  • it will be understood that FIG. 6 depicts an example related to the position parameter in the X-direction; a similar application of optimization process 232 can be applied to other parameters, such as the position parameters in the Y-direction and Z-direction, angles, angular velocity, angular acceleration, and so forth. This is not elaborated upon here.
  • FIG. 7 illustrates one implementation of trajectory optimization for motion trajectories containing inconsistent geometric parameters, i.e., one implementation of box 233.
  • distance confidences are set for each detection result on the motion trajectory. This distance confidence level indicates the credibility of the detection result based on distance factors.
  • Distance confidence can be represented in various forms, such as scores, multiple levels, or percentages, etc.
  • the total confidence level of each detection result is calculated based on the distance confidence and perception confidence for each detection result.
  • both confidence levels are normalized separately and then multiplied, added, or subjected to a weighted average to calculate the total confidence level.
  • the optimal geometric parameters for the motion trajectory are determined based on the calculated total confidence level.
  • the geometric parameters from the detection result with the highest total confidence level are determined as the optimal geometric parameters for the trajectory.
  • the optimal geometric parameters for the trajectory are calculated based on the geometric parameters and their total confidence level from each detection result on the trajectory.
  • the optimal geometric parameters can be calculated as the weighted average of the geometric parameters and total confidence level from the detection results on the trajectory.
  • inconsistent geometric parameters can be present in multiple detection results on the motion trajectory, such as cases involving size errors.
  • the optimal geometric parameters are calculated, and these are employed to optimize the motion trajectory.
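  • One possible (assumed) way to combine the distance confidence and the perception confidence into a total confidence and to derive optimal geometric parameters as a confidence-weighted average is sketched below; the normalization choice and the example values are illustrative:

```python
import numpy as np

def optimal_geometry(geom_params, distance_conf, perception_conf):
    """geom_params: (N, 3) array of [length, width, height] per detection result.
    distance_conf, perception_conf: (N,) confidence values per detection result.
    Returns the total-confidence-weighted average of the geometric parameters."""
    d = np.asarray(distance_conf, dtype=float)
    p = np.asarray(perception_conf, dtype=float)
    d, p = d / d.sum(), p / p.sum()            # normalize each confidence separately
    total = d * p                              # combine (here: by multiplication)
    total = total / total.sum()                # total confidence per detection result
    return total @ np.asarray(geom_params, dtype=float)

def best_detection_geometry(geom_params, total_conf):
    """Alternative: take the geometry of the detection with the highest total confidence."""
    return np.asarray(geom_params, dtype=float)[int(np.argmax(total_conf))]

# illustrative values only (e.g., a vehicle observed at four time instances)
geom = [[4.4, 1.7, 1.5], [4.6, 1.8, 1.5], [4.5, 1.8, 1.6], [4.8, 1.9, 1.6]]
dist_conf = [0.1, 0.2, 0.3, 0.5]
perc_conf = [0.9, 0.9, 0.5, 0.95]
print(optimal_geometry(geom, dist_conf, perc_conf))
```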
  • FIG. 8 illustrates an application example of the optimization process 233 described above.
  • the scenario in FIG. 8 pertains to a traffic scene.
  • the optimization apparatus 20 is set on the ego vehicle (EGO).
  • the target objects are vehicles in adjacent lanes.
  • the motion trajectory includes multiple detection results of the vehicle at various moments.
  • FIG. 8 shows the optimization result of the trajectory depicted in FIG. 3C .
  • the left trajectory in FIG. 8 represents the initial trajectory to be optimized, while the right trajectory represents the trajectory after optimization.
  • at 0 seconds, the distance between the vehicle in the adjacent lane and the ego vehicle is at its maximum; at 3 seconds, this distance is at its minimum.
  • accordingly, distance confidences are successively increased for the detection results at 0 seconds, 1 second, 2 seconds, and 3 seconds, i.e., 0.1, 0.2, 0.3, and 0.5, respectively. Then, using the above optimization calculation, the optimal size of the vehicle is determined, resulting in the optimized trajectory shown in the right part of FIG. 8 .
  • FIG. 9 illustrates one implementation of trajectory optimization for motion trajectories containing ID-switching trajectory segments, i.e., one implementation of box 234.
  • the presence of non-continuous trajectory segments in the motion trajectory is determined to identify if the trajectory contains ID-switching trajectory segments.
  • Non-continuous trajectory segments correspond to ID-switching trajectory segments.
  • effective merged trajectory segments are calculated based on one or multiple motion parameters from detection results at at least one endpoint of the non-continuous trajectory segment.
  • calculating effective merged trajectory segments involves the following steps: Initially, using a constant velocity motion model or constant acceleration motion model, the trajectory is extended from at least one endpoint (assuming the target object is moving at constant velocity or constant acceleration, predicting the object's position in subsequent moments), resulting in multiple predicted points corresponding to multiple detection moments. Then, the merging degree of similarity between each predicted point and the at least one endpoint is computed. This merging degree of similarity can be derived from three aspects of similarity:
  • the sum or average of these three aspects of similarity is computed as the merging similarity between two points. Predicted points with merging similarity greater than a predetermined merging threshold are considered effective merging points. Connecting at least one endpoint with the determined effective merging points results in effective merged trajectory segments.
  • the non-continuous trajectory segments are replaced with the effective merged trajectory segments, obtaining a continuous motion trajectory with only a single target object ID.
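  • The sketch below illustrates one possible implementation of the extension and merging step (constant-velocity extension from an endpoint and a merging similarity built from position- and time-based terms); the specific similarity terms and the threshold are assumptions, since the three similarity aspects are not detailed here:

```python
import math

def extend_endpoint(x, y, vx, vy, t0, future_times):
    """Constant-velocity extension from an endpoint (x, y) at time t0,
    producing one predicted point per detection moment."""
    return [(x + vx * (t - t0), y + vy * (t - t0), t) for t in future_times]

def merging_similarity(endpoint, predicted, max_dist=10.0, max_dt=5.0):
    """Assumed merging similarity in [0, 1]: average of a position term and a time term."""
    (ex, ey, et), (px, py, pt) = endpoint, predicted
    pos_sim = max(0.0, 1.0 - math.hypot(px - ex, py - ey) / max_dist)
    time_sim = max(0.0, 1.0 - abs(pt - et) / max_dt)
    return 0.5 * (pos_sim + time_sim)

def effective_merging_points(endpoint, vx, vy, candidate_times, threshold=0.5):
    """Keep only predicted points whose merging similarity exceeds the threshold."""
    preds = extend_endpoint(endpoint[0], endpoint[1], vx, vy, endpoint[2], candidate_times)
    return [p for p in preds if merging_similarity(endpoint, p) > threshold]
```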
  • FIG. 10 illustrates an application example of the above-mentioned optimization process 234.
  • FIG. 10 shows the optimization results of the trajectory to be optimized in FIG. 3D .
  • this trajectory contains non-continuous trajectory segments, namely, ID-switching trajectory segments, denoted as TRACKLET_2.
  • This trajectory contains four endpoints that can be used for trajectory extension, namely T1-T4. The extension from endpoint T1 is taken as an example; the other endpoints can be processed similarly.
  • starting from endpoint T1 and using a motion model (e.g., a constant velocity model), trajectory extension is performed to obtain predicted points P1-P3. Then, the merging similarity between endpoint T1 and each of the predicted points P1-P3 is calculated.
  • T1, P2, and T4 are connected to obtain an effectively merged trajectory segment.
  • the trajectory segment TRACKLET_2 is removed from this trajectory, and the merged trajectory segment is connected to this non-continuous portion to obtain the optimized trajectory.
  • FIG. 10 shows an example of extension from T1, and according to examples of the present invention, extension from one or more endpoints among T1-T4 can be performed, and the process is similar to the above-described process, which is not repeated here.
  • the trajectory optimization processing according to examples of the present invention can be implemented as offline processing, and it is advantageous to implement it as offline processing.
  • offline processing can handle historical data, save the computational power of real-time processing chips, and eliminate strict requirements on processing speed (e.g., real-time processing may require processing a set of data within a certain duration).
  • offline processing can be understood as non-real-time processing, i.e., processing that is not performed simultaneously with the sensor's perception of target objects.
  • sensor-generated target object data can be stored in a memory (not shown), and then the target object data can be retrieved from the memory periodically or based on user requests or triggering signals for offline processing.
  • FIG. 11 illustrates a motion trajectory optimization method 1100 according to an example of the present invention. This method can be executed by the above-mentioned optimization apparatus 20 or system 100, so the descriptions above about the optimization apparatus and system 100 are equally applicable here and are not repeated.
  • step 1101 target object data from sensors perceiving multiple target objects are obtained.
  • the target object data is tracked to obtain multiple motion trajectories of the multiple target objects.
  • Each motion trajectory includes multiple detection results of a target object at multiple time instances.
  • Each detection result includes: the ID of the target object, detection timestamp, and multiple feature parameters.
  • the multiple feature parameters include one or more motion parameters and one or more geometric parameters.
  • step 1103 the following optimizations are performed on the multiple motion trajectories:
  • the present invention also provides a machine-readable storage medium storing executable instructions that, when executed, cause one or more processors to perform the method 1100 as described above.
  • processors can be implemented using electronic hardware, computer software, or any combination thereof. Whether these processors are implemented as hardware or software will depend on the specific application and overall design constraints imposed on the system.
  • the processors, any portions thereof, or any combinations of the processors provided herein can be implemented as microprocessors, microcontrollers, digital signal processors (DSPs), field-programmable gate arrays (FPGAs), programmable logic devices (PLDs), state machines, gate logic, discrete hardware circuits, and other suitable processing components configured to perform various functionalities described in the present invention.
  • the functionalities of the processors, any portions thereof, or any combinations of the processors provided herein can be implemented as software executed on platforms such as microprocessors, microcontrollers, DSPs, or other suitable platforms.
  • Software can be broadly construed as representing instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, execution threads, processes, functions, and the like.
  • Software can reside on computer-readable media.
  • Computer-readable media can include, for example, storage devices such as magnetic storage devices (e.g., hard drives, floppy disks, magnetic tapes), optical disks, smart cards, flash devices, random-access memory (RAM), read-only memory (ROM), programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), registers, or removable disks.
  • storage is depicted as being separate from the processor in several aspects provided in the present invention, storage may also be located within the processor (e.g., cache or registers).
EP23194429.9A 2022-09-09 2023-08-31 Mehrobjektverfolgungssystem und vorrichtung zur optimierung seiner bewegungsbahn und optimierungsverfahren Pending EP4336445A2 (de)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211101796.0A CN117710408A (zh) 2022-09-09 2022-09-09 多目标跟踪系统及其运动轨迹优化设备和优化方法

Publications (1)

Publication Number Publication Date
EP4336445A2 true EP4336445A2 (de) 2024-03-13

Family

ID=87889439

Family Applications (1)

Application Number Title Priority Date Filing Date
EP23194429.9A Pending EP4336445A2 (de) 2022-09-09 2023-08-31 Mehrobjektverfolgungssystem und vorrichtung zur optimierung seiner bewegungsbahn und optimierungsverfahren

Country Status (2)

Country Link
EP (1) EP4336445A2 (de)
CN (1) CN117710408A (de)

Also Published As

Publication number Publication date
CN117710408A (zh) 2024-03-15

Similar Documents

Publication Publication Date Title
JP7140922B2 (ja) マルチセンサデータ融合方法、及び装置
US10324469B2 (en) System and method for controlling motion of vehicle in shared environment
CN105774805B (zh) 用于估算车道的系统及其方法
Apostoloff et al. Robust vision based lane tracking using multiple cues and particle filtering
CN113715814B (zh) 碰撞检测方法、装置、电子设备、介质及自动驾驶车辆
US8818702B2 (en) System and method for tracking objects
US11782158B2 (en) Multi-stage object heading estimation
Jeong et al. Bidirectional long shot-term memory-based interactive motion prediction of cut-in vehicles in urban environments
CN112526999B (zh) 速度规划方法、装置、电子设备和存储介质
KR102570338B1 (ko) 자동차의 환경에서 타겟 자동차의 궤적을 예측하기 위한 방법 및 시스템
KR102592830B1 (ko) 차량용 센서퓨전 타겟 예측 장치 및 그의 센서 퓨전 타겟 예측 방법과 그를 포함하는 차량
CN110632916B (zh) 行动预测装置以及自动驾驶装置
EP3511740B1 (de) Verfahren, system und computerprogrammprodukt zur bestimmung einer blockade eines sensors mehrerer sensoren eines ego-fahrzeugs
JP5482323B2 (ja) 運転支援装置及びプログラム
CN111707258B (zh) 一种外部车辆监测方法、装置、设备及存储介质
CN116576857A (zh) 一种基于单线激光雷达的多障碍物预测导航避障方法
CN113942524A (zh) 一种车辆行驶控制方法、系统及计算机可读存储介质
Manghat et al. Forward collision prediction with online visual tracking
EP4336445A2 (de) Mehrobjektverfolgungssystem und vorrichtung zur optimierung seiner bewegungsbahn und optimierungsverfahren
CN110497906B (zh) 车辆控制方法、装置、设备和介质
CN112163521A (zh) 一种车辆行驶行为的识别方法、装置及设备
KR20200133122A (ko) 차량 충돌 방지 장치 및 방법
CN115123291A (zh) 一种基于障碍物识别的行为预测方法及装置
US20220155455A1 (en) Method and system for ground surface projection for autonomous driving
US11798295B2 (en) Model free lane tracking system

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR