CN114170274A - Target tracking method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN114170274A (application CN202210127985.9A)
- Authority
- CN
- China
- Prior art keywords
- obstacle
- trajectory
- sensor data
- distance
- frames
- Prior art date
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion)
Classifications
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis; G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/277—Analysis of motion involving stochastic approaches, e.g. using Kalman filters
- G06T2207/00—Indexing scheme for image analysis or image enhancement; G06T2207/30—Subject of image; context of image processing
- G06T2207/30241—Trajectory
- G06T2207/30248—Vehicle exterior or interior; G06T2207/30252—Vehicle exterior; vicinity of vehicle
- G06T2207/30261—Obstacle
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Radar Systems Or Details Thereof (AREA)
Abstract
The invention relates to a target tracking method and apparatus, an electronic device, and a storage medium. The method comprises: obtaining obstacle information from each frame of sensor data from a plurality of sensors; performing trajectory matching according to the obstacle information; and performing trajectory tracking and updating on the matched obstacle and trajectory. Performing trajectory matching according to the obstacle information comprises: determining whether the obstacle was previously present; when it is determined that the obstacle was previously present, determining whether the obstacle was present in a predetermined number of previous sensor data frames; and, when the obstacle was present in the predetermined number of previous sensor data frames, judging whether the distance between the obstacle and the trajectory formed by those frames is less than a predetermined distance. When the distance is judged to be less than the predetermined distance, the obstacle is confirmed to match the trajectory formed by the predetermined number of sensor data frames.
Description
Technical Field
The present invention relates to autonomous driving technology.
Background
With advances in artificial intelligence, the autonomous driving industry is developing rapidly. Perception, decision planning, and control form its main links.
Perception uses on-board sensors to detect the surrounding environment accurately and in real time, providing the basis for autonomous driving decisions. It is the first problem an intelligent vehicle must solve: the accuracy and robustness of perception directly determine whether an autonomous vehicle can traverse a road segment efficiently, without accident, and as expected.
Existing technology cannot meet the requirements of all-weather operation and accurate detection of complex and varied obstacle targets. In addition, current obstacle matching is relatively slow, which hampers obstacle tracking.
Disclosure of Invention
The present invention has been made in view of the above problems in the prior art, and aims to solve one or more of them and to provide at least one advantageous alternative.
According to an aspect of the present invention, there is provided a target tracking method, comprising: obtaining obstacle information from each frame of sensor data from a plurality of sensors; performing trajectory matching according to the obstacle information; and performing trajectory tracking on the matched obstacle and trajectory. Performing trajectory matching according to the obstacle information comprises: determining whether the obstacle was previously present; when it is determined that the obstacle was previously present, determining whether the obstacle was present in a predetermined number of previous sensor data frames; when it is determined that the obstacle was present in the predetermined number of previous sensor data frames, determining whether the distance between the obstacle and the trajectory formed by those frames is less than a predetermined distance; and, when the distance is determined to be less than the predetermined distance, confirming that the obstacle matches the trajectory.
According to another aspect of the present invention, there is provided a target tracking apparatus, comprising: an obstacle information acquisition unit, which performs spatial synchronization and data fusion on sensor data frames from a plurality of sensors to acquire obstacle information from each frame; a trajectory matching unit, which performs trajectory matching according to the obstacle information; and a trajectory tracking unit, which tracks and updates the matched obstacle and trajectory. The trajectory matching unit includes: a presence determination unit, which determines whether the obstacle was previously present; a continuous presence determination unit, which, when the obstacle is determined to have been previously present, determines whether the obstacle was present in a predetermined number of previous sensor data frames; a distance determination unit, which, when the obstacle is determined to have been present in the predetermined number of previous frames, determines whether the distance between the obstacle and the trajectory formed by those frames is less than a predetermined distance; and a matching determination unit, which, when the distance determination unit determines that the distance is less than the predetermined distance, confirms that the obstacle matches the trajectory formed by the predetermined number of previous frames.
According to another aspect of the present invention, there is provided an electronic apparatus including: a processor; a memory for storing the processor-executable instructions; wherein the processor is configured to execute the instructions to implement the method of the present invention.
According to another aspect of the present invention, there is provided a computer-readable storage medium on which a device control program is stored which, when executed by a processor, implements any of the methods of the present invention.
According to embodiments of the invention, pre-matching against existing trajectories is performed before a matching algorithm such as the Hungarian algorithm is run, which reduces computational complexity, increases matching speed, and helps reduce accidents.
According to embodiments of the invention, the data obtained by the various sensors are serially combined, and the update frequency of the combined data stream is greater than that of any single sensor, which improves the accuracy of the tracking result.
Drawings
The invention may be better understood with reference to the following drawings. The drawings are schematic only and do not limit the scope of the invention.
FIG. 1 is a schematic flow chart diagram illustrating a target tracking method in accordance with one embodiment of the present invention.
Fig. 2 is a schematic diagram illustrating coordinate system conversion according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating data fusion according to one embodiment of the invention.
FIG. 4 is a diagram illustrating determining whether an obstacle matches an existing trajectory according to one embodiment of the invention.
FIG. 5 is a schematic block diagram illustrating an object tracking device in accordance with one embodiment of the present invention.
Fig. 6 shows a schematic block diagram of a trajectory matching unit according to an embodiment of the present invention.
Detailed Description
The following describes embodiments of the present invention with reference to the drawings. These descriptions are exemplary, intended to enable those skilled in the art to practice embodiments of the invention, and not intended to limit the scope of the invention. Details that are necessary for an actual implementation but irrelevant to understanding the invention are omitted from the description.
FIG. 1 is a schematic flow chart illustrating a target tracking method according to one embodiment of the present invention. As shown in FIG. 1, the method first performs spatial synchronization and data fusion on sensor data frames from a plurality of sensors to obtain obstacle information from each frame, in step S100. Each radar or camera performs obstacle detection, such as photographing or scanning, according to its own monitoring cycle. In this specification, each transmission of data by a radar or camera according to its monitoring cycle is referred to as a sensor data frame. Spatial synchronization and data fusion are then performed to obtain the obstacle information. The obstacle may be a vehicle, and the sensors may include millimeter-wave radar, cameras, laser radar, ultrasonic radar, and the like. According to one embodiment, spatial synchronization is accomplished by unifying the data measured by the various sensors in their respective sensor coordinate systems into the ego-vehicle coordinate system. The ego vehicle is the vehicle equipped with these sensors, and it is the ego vehicle that performs the target tracking of the invention. Unifying the data measured by the various sensors in their respective sensor coordinate systems into the ego-vehicle coordinate system requires coordinate system conversion.
The sensors of the present invention may include millimeter-wave radar, cameras, laser radar, ultrasonic radar, and the like, each with different characteristics:
(1) Millimeter-wave radar accurately measures radial distance and velocity, has a long detection range and a wide field of view, penetrates smoke, fog, and dust well, and is insensitive to weather; however, its azimuth resolution is low, its measurements contain considerable clutter, and it cannot classify targets.
(2) Cameras offer high azimuth resolution for object detection, can classify targets, produce few false target reports, and are inexpensive; however, their distance accuracy degrades rapidly with range, and they are easily defeated by lighting conditions and rain or fog.
(3) Laser radar obtains accurate position information for a target point cloud, including reflection intensity, and multi-line laser radar can even classify objects; however, its accuracy degrades in rain, snow, and fog, and experiments show that its detection range drops sharply as rainfall increases.
(4) Ultrasonic radar is compact and gives stable measurements, but it provides only radial distance information, and because ultrasonic waves attenuate quickly in air its detection range is short; it is therefore mostly used for short-range obstacle detection scenarios such as automatic parking.
The technical scheme of the invention uses these sensors in combination, so that the sensors' data frames are evenly distributed in time, drawing on each sensor's strengths while avoiding its weaknesses.
Fig. 2 shows a schematic diagram of performing coordinate system conversion according to an embodiment of the invention. As shown in fig. 2, the conversion of the sensor coordinate system to the host vehicle coordinate system requires a 3 × 3 rotation matrix M and a 3 × 1 translation vector t.
P_ego = M · P_sensor + t

where P_ego is the coordinate in the ego-vehicle coordinate system and P_sensor is the coordinate in the sensor coordinate system; t is the translation vector, whose x-, y-, and z-components are computed by adding and subtracting the offsets between the sensor's mounting position and the ego vehicle's rear axle; and M is the 3 × 3 rotation matrix that rotates the sensor coordinate system through given angles about the three coordinate axes. Spatial conversion is thus performed: coordinates in the sensor coordinate system are rotated and translated to obtain coordinates in the ego-vehicle coordinate system.
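The rotation-and-translation conversion above can be sketched as follows. This is an illustrative snippet, not part of the patent: it handles only the yaw-about-z special case of the general 3 × 3 rotation, and the 90° yaw and 2 m mounting offset are made-up example values.

```python
import math

def sensor_to_ego(p_sensor, yaw_deg, t):
    """Convert a sensor-frame point to the ego-vehicle frame:
    rotate about the z axis (yaw only, for simplicity), then
    translate by the sensor's mounting offset t relative to the
    ego vehicle's rear axle."""
    x, y, z = p_sensor
    a = math.radians(yaw_deg)
    xr = math.cos(a) * x - math.sin(a) * y
    yr = math.sin(a) * x + math.cos(a) * y
    return (xr + t[0], yr + t[1], z + t[2])

# A point 1 m ahead of a sensor that is mounted 2 m forward of the
# rear axle and rotated 90 degrees about z:
p_ego = sensor_to_ego((1.0, 0.0, 0.0), 90.0, (2.0, 0.0, 0.0))
```

In the general case the yaw-only rotation would be replaced by the full rotation matrix M composed from the angles about all three axes.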
FIG. 3 shows a schematic diagram of data fusion according to an embodiment of the invention. In FIG. 3, the triangles and five-pointed stars represent the update times of two sensors' measurements. As shown in FIG. 3, data fusion is accomplished by serially combining the data obtained from the various sensors in chronological order, on a first-come-first-served basis. As can be seen from FIG. 3, the update frequency of the serially combined data stream is greater than that of any single sensor, which means the state of the central trajectory is also updated more frequently, improving the accuracy of the tracking result.
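The first-come-first-served serial combination of FIG. 3 amounts to a timestamp-ordered merge of the per-sensor measurement streams. A minimal sketch (illustrative only; the stream names and timestamps are made up):

```python
import heapq

def fuse_streams(*streams):
    """Serially combine per-sensor measurement lists (each already
    sorted by timestamp) into one stream, first come, first served."""
    return list(heapq.merge(*streams, key=lambda m: m[0]))

# (timestamp_s, payload) pairs from two hypothetical sensors
radar = [(0.00, "radar"), (0.10, "radar")]
camera = [(0.05, "cam"), (0.15, "cam")]
fused = fuse_streams(radar, camera)
```

The fused stream here updates every 50 ms even though each individual sensor updates only every 100 ms, which is the frequency gain the paragraph above describes.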
Then, in step S200, trajectory matching is performed according to the obstacle information. This can be done in stages: first, judge whether the obstacle matches an existing trajectory; then, for obstacles that match no existing trajectory, perform trajectory matching using the Hungarian algorithm.
FIG. 4 is a diagram illustrating determining whether an obstacle matches an existing trajectory according to one embodiment of the invention. As shown in fig. 4, according to one embodiment, whether an obstacle matches an existing trajectory is determined as follows.
First, it is determined whether the obstacle was previously present. According to one embodiment, a matching algorithm such as the Hungarian algorithm is used to match the list of existing obstacles against the obstacles found in the current sensor data frame, thereby determining whether an obstacle was previously present. The Hungarian algorithm finds a maximum matching, i.e. it favors the largest number of matched pairs between the two sets; since it is a well-known technique in the art, it is not described in detail here. According to an embodiment of the invention, an obstacle identification (ID) pool may be maintained: when a trajectory is initialized for an obstacle, an ID is requested from the pool, and when the trajectory is deleted, the ID is released, ensuring that each obstacle corresponds to a unique ID. While the ego vehicle is driving, it continuously maintains a list of existing obstacles, because obstacles requiring tracking are continuously discovered. When a radar or camera sends data (i.e. a sensor data frame) according to its monitoring cycle, the obstacles present in that data are extracted to form a second list. Matching the two lists determines whether an obstacle found in the current frame was previously present.
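For illustration, the maximum-matching step can be mimicked with a tiny brute-force assignment over a distance cost matrix. This is a stand-in for the Hungarian algorithm (practical only for small n, whereas the Hungarian algorithm runs in polynomial time); the cost values are made up.

```python
from itertools import permutations

def best_assignment(cost):
    """Exhaustive stand-in for the Hungarian algorithm: return the
    tracked-obstacle -> detection assignment (as a tuple of column
    indices, one per row) with minimum total cost.  cost must be a
    square matrix; cost[i][j] is the distance between tracked
    obstacle i and detection j in the current frame."""
    n = len(cost)
    return min(permutations(range(n)),
               key=lambda cols: sum(cost[r][cols[r]] for r in range(n)))

# Three tracked obstacles vs. three detections (distances in metres)
cost = [[0.2, 5.0, 9.0],
        [4.0, 0.3, 6.0],
        [7.0, 8.0, 0.1]]
assign = best_assignment(cost)
```

In production one would use a proper Hungarian implementation (e.g. `scipy.optimize.linear_sum_assignment`); the point here is only the shape of the problem: each existing obstacle is paired with at most one detection so as to minimize total distance.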
Then, when it is determined that the obstacle was previously present, it is determined whether the obstacle was present in a predetermined number of previous frames. According to one embodiment, the distance the vehicle travels in the time corresponding to the predetermined number of frames is less than half of the vehicle's shortest emergency-braking slide distance, so that enough time remains for emergency handling even if an obstacle ahead suddenly changes lanes or brakes hard.
For example, suppose the current vehicle speed is 36 km/h (10 m/s), the vehicle's shortest emergency-braking slide distance is 10 m, the software running cycle (frame interval) is 50 ms, and the predetermined number of frames is 10. In this case the physical time corresponding to 10 frames is 0.5 s, during which the vehicle travels 5 m, half of the 10 m shortest emergency-braking slide distance. This example simplifies the calculation for ease of understanding by those skilled in the art.
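The arithmetic of this example can be checked directly (illustrative sketch; the values are exactly those assumed above):

```python
# Worked check of the example: at 36 km/h, with a 10 m shortest
# emergency-braking slide distance and a 50 ms software cycle,
# 10 frames cover 0.5 s, i.e. 5 m of travel -- half the slide distance.
speed_mps = 36 / 3.6          # 36 km/h -> 10 m/s
frame_interval_s = 0.05       # 50 ms software running cycle
n_frames = 10
window_s = n_frames * frame_interval_s   # time covered by the window
distance_m = speed_mps * window_s        # distance travelled in it
```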
In addition, the predetermined number of frames should be more than 8; experiments show that matching works better with more than 8 frames. In the embodiment shown in FIG. 4, the predetermined number of frames is 16.
The ID of the obstacle may be compared against a previously saved list of IDs corresponding to the obstacles in the predetermined number (here 16) of frames, to determine whether the obstacle was present in all of the previous 16 frames.
Next, if the obstacle is judged to have been present in the previous predetermined number of frames, a matching determination is performed. According to one embodiment, the match is determined by checking whether the distance between the obstacle and the trajectory formed by the predetermined number of frames is less than a predetermined distance. According to one embodiment, this predetermined distance should be between 0.9 m and 1.1 m (1 m in the example of FIG. 4). Setting the predetermined distance allows existing trajectories to be matched as far as possible while reducing mismatches. The predetermined distance is a Euclidean distance.
According to one embodiment, the Euclidean distance may be calculated as follows. The lateral and longitudinal position coordinates of the obstacle in the current frame are (x_sensor, y_sensor), and the obstacle trajectory formed by the predetermined number of frames has corresponding lateral and longitudinal position coordinates (x_track, y_track). The distance is calculated as:

d = √((x_sensor − x_track)² + (y_sensor − y_track)²)
according to one embodiment, the obstacle trajectory corresponds to a lateral-longitudinal position coordinate (x)track,ytrack) Is the horizontal and vertical position coordinates of the obstacle in the nearest frame. According to another embodiment, the obstacle trajectory corresponds to a lateral-longitudinal position coordinate (x)track,ytrack) The position coordinate of the position of the obstacle at the current frame time is calculated by a position estimation algorithm according to the state information such as the horizontal and vertical position coordinates of the obstacle in the latest frame. The dead reckoning algorithm is, for example, an extended Kalman filterWave algorithms, etc.
When the distance between the obstacle in the current frame and the trajectory formed by the predetermined number of frames is judged to be less than the predetermined distance, the obstacle is confirmed to match that trajectory. This process of judging whether the obstacle and a maintained obstacle trajectory correspond to the same obstacle is the matching; the match succeeds when the distance condition is satisfied.
Finally, in step S300, the matched trajectories are updated. For an obstacle matched in step S200, its trajectory is updated using Kalman filtering; for an obstacle that matches no trajectory, a new trajectory is created in the trajectory list.
According to one embodiment, Kalman filtering is employed to update the trajectory as follows.
Step A: predicting the position of the next moment according to the position of the current moment:
in the formula (I), the compound is shown in the specification,is an estimate based on the state at time k-1, Xk-1The state at time k-1. The state refers to a series of signal sets which need maintenance for obstacle tracking and can include but is not limited to longitudinal position, lateral position, longitudinal speed, lateral speed, longitudinal acceleration, lateral acceleration and the like of the obstacle. u. ofkThe controller vector at time k may be zero. A is the state transition matrix and B is the control matrix acting on the controller vector. The two matrices are predetermined and are related to kinematic models (such as constant velocity model, constant acceleration model, constant rate and velocity model, constant rate and acceleration model, etc.) used by the algorithm.
Step B: predict the covariance matrix at the next moment from the current covariance matrix:

P̂_k = A P_{k−1} Aᵀ + Q

where P̂_k is the covariance matrix of the state estimate propagated from time k−1 to time k, P_{k−1} is the covariance matrix of the state at time k−1, and Q is the covariance matrix of the process noise.
Step C: compute the Kalman gain from the predicted covariance matrix:

K_k = P̂_k Hᵀ (H P̂_k Hᵀ + R)⁻¹

where K_k is the Kalman gain at time k, H is the observation matrix mapping the true state space to the observation space, and R is the covariance matrix of the observation noise. The signal quantities actually measured by a sensor do not correspond one-to-one with the state quantities to be maintained; the H matrix describes the conversion between the two. It differs with the sensor type and the maintained state quantities, and once the sensor type, the sensor outputs, and the maintained state quantities are determined, H can be derived from simple kinematic or dynamic relations. The R matrix describes the covariance of each sensor output signal, whose practical meaning is the uncertainty of each sensor quantity; it is obtained from the measurement uncertainties that the upstream sensor reports for each output, or the sensor's error distribution can instead be measured experimentally against a ground-truth system.
Step D: update the predicted state and covariance matrix using the Kalman gain.

Update the estimate:

X_k = x̂_k + K_k (z_k − H x̂_k)

Update the covariance matrix:

P_k = (I − K_k H) P̂_k

where z_k is the state measurement at time k and I is the identity matrix. When the estimate update and the covariance matrix update are complete, the trajectory update is complete.
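Steps A–D can be sketched as one predict/update cycle. This is an illustrative NumPy snippet under a 1-D constant-velocity model with no control input (B u_k = 0); all matrices and values are made-up examples, not taken from the patent.

```python
import numpy as np

def kf_update(x, P, z, A, H, Q, R):
    """One Kalman-filter cycle following steps A-D
    (control input B @ u_k omitted, i.e. assumed zero)."""
    # Steps A and B: predict state and covariance
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Step C: Kalman gain
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    # Step D: correct the prediction with the measurement z
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# State [position, velocity]; dt matches the 50 ms cycle above
dt = 0.05
A = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
H = np.array([[1.0, 0.0]])              # only position is observed
Q = np.eye(2) * 1e-4                    # process noise (assumed)
R = np.array([[0.25]])                  # observation noise (assumed)
x, P = np.array([0.0, 10.0]), np.eye(2)
x, P = kf_update(x, P, np.array([0.55]), A, H, Q, R)
```

The corrected position lands between the prediction (0.5 m) and the measurement (0.55 m), weighted by the Kalman gain, which is the behavior steps A–D describe.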
According to one embodiment, updating may further include storing the obstacle's position coordinates in the current frame as trajectory position coordinates, incrementing the count of consecutive frames of the current trajectory, and so on.
According to one embodiment, a trajectory that has not been updated with a matching obstacle for more than a predetermined number of frames is removed from the trajectory list. According to one embodiment, this predetermined number is 5 to 7 frames, and it should be less than the aforementioned predetermined number of frames used to judge the continuous presence of an obstacle. This better avoids erroneously removing an obstacle.
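The stale-track removal described above might look like the following (illustrative sketch; the dict-based track representation and the 6-frame threshold, taken from the 5–7 frame range, are assumptions):

```python
def prune_tracks(tracks, frame_now, max_missed=6):
    """Drop trajectories whose last update is more than max_missed
    frames old; kept trajectories continue to be tracked."""
    return [t for t in tracks if frame_now - t["last_frame"] <= max_missed]

tracks = [{"id": 1, "last_frame": 100},   # updated this frame
          {"id": 2, "last_frame": 93}]    # 7 frames stale at frame 100
alive = prune_tracks(tracks, frame_now=100)
```

Keeping `max_missed` below the 16-frame continuous-presence window means a trajectory is pruned well before it could spuriously satisfy the presence check.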
FIG. 5 is a schematic block diagram illustrating an object tracking device in accordance with one embodiment of the present invention.
As shown in fig. 5, the target tracking device according to an embodiment of the present invention includes: an obstacle information acquisition unit 100, which acquires obstacle information from each sensor data frame, for example by spatially synchronizing and fusing the sensor data frames from the various sensors; a trajectory matching unit 200, which performs trajectory matching according to the obstacle information; and a trajectory tracking unit 300, which performs trajectory tracking and updating on the matched obstacles and trajectories.
Fig. 6 shows a schematic block diagram of the trajectory matching unit according to an embodiment of the present invention. As shown in fig. 6, according to one embodiment, the trajectory matching unit 200 includes: a presence determination unit 201, which determines whether the obstacle was previously present; a continuous presence determination unit 202, which, when the obstacle is determined to have been previously present, determines whether the obstacle was present in a predetermined number of previous sensor data frames; a distance determination unit 203, which, when the obstacle is determined to have been present in the predetermined number of previous frames, determines whether the distance between the obstacle and the trajectory formed by those frames is less than a predetermined distance; and a matching determination unit 204, which, when the distance is determined to be less than the predetermined distance, confirms that the obstacle matches the trajectory formed by the predetermined number of frames.
According to one embodiment, the presence determination unit 201 determines whether an obstacle was previously present by matching a list of existing obstacles with obstacles found in the sensor data frame using the hungarian algorithm.
According to one embodiment, the predetermined number is greater than 8 and the predetermined distance is 0.9m to 1.1 m.
According to one embodiment, the obstacle information acquisition unit 100 performs spatial synchronization by unifying the data measured by the various sensors in their own coordinate systems into the ego-vehicle coordinate system, and performs data fusion by serially combining the data obtained by the various sensors in chronological order on a first-come-first-served basis. Using data from multiple kinds of sensors improves the matching speed.
According to one embodiment, when tracking the matched obstacles and trajectories, the trajectory tracking unit 300 stops tracking a trajectory if it has not been updated for a certain number of frames.
According to one embodiment, the predetermined number of frames is from 5 frames to 7 frames.
According to one embodiment, the trajectory tracking unit 300 employs Kalman filtering for trajectory updating as follows.

Step A: predict the state at the next moment from the current state:

x̂_k = A X_{k−1} + B u_k

where x̂_k is the estimate based on the state at time k−1, X_{k−1} is the state at time k−1, u_k is the controller vector at time k, A is the state transition matrix, and B is the control matrix acting on the controller vector.

Step B: predict the covariance matrix at the next moment from the current covariance matrix:

P̂_k = A P_{k−1} Aᵀ + Q

where P̂_k is the covariance matrix of the state estimate propagated from time k−1 to time k, P_{k−1} is the covariance matrix of the state at time k−1, and Q is the covariance matrix of the process noise.

Step C: compute the Kalman gain from the predicted covariance matrix:

K_k = P̂_k Hᵀ (H P̂_k Hᵀ + R)⁻¹

where K_k is the Kalman gain at time k, H is the observation matrix mapping the true state space to the observation space, and R is the covariance matrix of the observation noise.

Step D: update the predicted state and covariance matrix using the Kalman gain.

Update the estimate:

X_k = x̂_k + K_k (z_k − H x̂_k)

Update the covariance matrix:

P_k = (I − K_k H) P̂_k

where z_k is the state measurement at time k and I is the identity matrix. When the estimate update and the covariance matrix update are complete, the trajectory update is complete.
Those skilled in the art will readily appreciate that the above illustrations and descriptions of the methods may be used to understand and implement the corresponding parts of the apparatus of the present invention. Depending on context, "update" and "track" may carry the same meaning.
Those skilled in the art will readily appreciate that the method of the present invention may also include other steps corresponding to the functions performed by the apparatus of the present invention. The above steps may also be simplified.
The numbering of the elements and steps of the present invention is for convenience of description only and does not indicate the order of execution unless otherwise indicated in the context.
Those skilled in the art will appreciate that the above units can be implemented by software or special hardware, such as a field programmable gate array, a single chip, or a microchip, or by a combination of software and hardware.
The present invention also provides an electronic device, comprising: a processor; a memory for storing the processor-executable instructions; wherein the processor is configured to execute the instructions to implement the method of the present invention.
The invention also relates to a computer software which, when executed by a computing device (such as a single-chip microcomputer, a computer, a CPU, etc.), can implement the method of the invention.
The present invention also relates to a computer software storage device, such as a hard disk, a floppy disk, a flash memory, etc., which stores the above computer software.
The description of the method or steps of the invention may be used for understanding the description of the unit or device, and the description of the unit or device may be used for understanding the method or steps of the invention.
The above description is intended to be illustrative, and not restrictive, and any changes and substitutions that come within the spirit of the invention are desired to be protected.
Claims (10)
1. A target tracking method, comprising the steps of:
obtaining obstacle information from each frame of sensor data from a plurality of sensors;
carrying out track matching according to the obstacle information;
performing track tracking and updating on the matched obstacles and tracks,
wherein performing trajectory matching based on the obstacle information comprises:
determining whether the obstacle was previously present;
when it is determined that the obstacle exists before, determining whether the obstacle exists in a predetermined number of previous sensor data frames;
and, when the obstacle is determined to have been present in the predetermined number of previous sensor data frames, judging whether the distance between the obstacle and the trajectory formed by the predetermined number of sensor data frames is less than a predetermined distance, and, when the distance is judged to be less than the predetermined distance, confirming that the obstacle matches the trajectory formed by the predetermined number of sensor data frames.
2. The target tracking method of claim 1, wherein determining whether an obstacle was previously present is performed by matching a list of existing obstacles with obstacles found in the sensor data frame using the Hungarian algorithm.
3. The target tracking method according to claim 1, wherein the time spanned by the predetermined number of sensor data frames corresponds to less than half of the vehicle's shortest emergency-braking sliding distance, the predetermined number is greater than 8, and the predetermined distance is between 0.9 m and 1.1 m.
4. The method of claim 1, wherein the obstacle information is obtained from each sensor data frame by spatially synchronizing and fusing the sensor data frames from the plurality of sensors as follows:
spatial synchronization is completed by transforming all data, measured by the various sensors in their respective sensor coordinate systems, into a unified vehicle coordinate system; and
data fusion is completed by serially combining the data obtained by the various sensors in time order on a first-come, first-served basis, wherein the various sensors comprise a millimeter-wave radar, a camera, a lidar, and an ultrasonic radar.
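A minimal sketch of the synchronization and fusion steps in claim 4: each sensor's measurements are transformed into the vehicle frame by a per-sensor rigid 2D transform, then merged serially in timestamp order. The frame dictionary layout and mounting parameters are illustrative assumptions:

```python
import math

def to_vehicle_frame(point, yaw, tx, ty):
    """Transform a 2D point from a sensor coordinate system into the
    vehicle coordinate system: rotate by the sensor's mounting yaw,
    then translate by its mounting offset (tx, ty)."""
    x, y = point
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * x - s * y + tx, s * x + c * y + ty)

def synchronize_and_fuse(frames):
    """Unify all sensor detections into the vehicle frame, then merge
    them serially in timestamp order (first come, first served).

    frames: list of dicts with keys "t", "points", "yaw", "tx", "ty".
    """
    fused = []
    for frame in sorted(frames, key=lambda f: f["t"]):
        for p in frame["points"]:
            fused.append(to_vehicle_frame(p, frame["yaw"], frame["tx"], frame["ty"]))
    return fused
```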
5. The target tracking method of claim 1, wherein the trajectory update is performed using Kalman filtering as follows:
Step A: predict the position at the next moment from the position at the current moment:
X̂_k = A·X_{k-1} + B·u_k
where X̂_k is the state estimate at time k based on the state at time k-1; X_{k-1} is the state at time k-1; u_k is the controller vector at time k; A is the state transition matrix; and B is the control matrix acting on the controller vector.
Step B: predict the covariance matrix at the next moment from the current covariance matrix:
P̂_k = A·P_{k-1}·Aᵀ + Q
where P̂_k is the covariance matrix of the state estimate from time k-1 to time k, P_{k-1} is the covariance matrix of the state at time k-1, and Q is the covariance matrix of the process noise.
Step C: calculate the Kalman gain from the predicted covariance matrix at the next moment:
K_k = P̂_k·Hᵀ·(H·P̂_k·Hᵀ + R)⁻¹
where K_k is the Kalman gain at time k, H is the observation matrix mapping the true state space to the observation space, and R is the covariance matrix of the observation noise.
Step D: update the predicted position and covariance matrix at the next moment according to the Kalman gain:
update the estimator: X_k = X̂_k + K_k·(Z_k − H·X̂_k)
update the covariance matrix: P_k = (I − K_k·H)·P̂_k
where Z_k is the state measurement value at time k and I is the identity matrix.
When the estimator update and the covariance matrix update are completed, the trajectory update is completed.
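Steps A through D of claim 5 can be sketched as a single Kalman filter cycle with NumPy; the matrix dimensions and noise values used in the usage example below are illustrative assumptions:

```python
import numpy as np

def kalman_step(x, P, z, A, B, u, H, Q, R):
    """One Kalman filter cycle: predict (Steps A and B), gain (Step C),
    and measurement update (Step D).

    x, P: previous state and covariance; z: new measurement.
    All matrix shapes must be mutually consistent.
    """
    # Step A: predict the next state
    x_pred = A @ x + B @ u
    # Step B: predict the next covariance
    P_pred = A @ P @ A.T + Q
    # Step C: compute the Kalman gain
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    # Step D: update estimate and covariance with the measurement
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(P.shape[0]) - K @ H) @ P_pred
    return x_new, P_new
```

For example, tracking a single scalar position with A = B = H = 1 and a zero control input converges toward the measurements after a few cycles.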
6. The target tracking method according to claim 1, wherein, during tracking of the matched obstacles and trajectories, if a trajectory has not been updated for more than a predetermined number of frames, that trajectory is no longer tracked.
7. The object tracking method of claim 6, wherein the predetermined number of frames is 5 to 7 frames.
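The stale-trajectory pruning of claims 6 and 7 can be sketched as follows; the threshold of 6 missed frames falls within the claimed range of 5 to 7, and the data layout is an assumption:

```python
def prune_stale_tracks(tracks, current_frame, max_missed=6):
    """Drop trajectories not updated for more than max_missed frames.

    tracks maps track id -> frame index of the last update.
    Returns only the tracks that are still being followed.
    """
    return {tid: last for tid, last in tracks.items()
            if current_frame - last <= max_missed}
```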
8. A target tracking apparatus, comprising:
an obstacle information acquisition unit that acquires obstacle information from each sensor data frame from a plurality of sensors;
a trajectory matching unit that performs trajectory matching according to the obstacle information; and
a trajectory tracking unit that performs trajectory tracking and updating on the matched obstacles and trajectories,
wherein the trajectory matching unit includes:
a presence determination unit that determines whether the obstacle was previously present;
a continuous presence determination unit that, when the obstacle was previously present, determines whether the obstacle appears in a predetermined number of previous sensor data frames;
a distance determination unit that, when the obstacle appears in the predetermined number of previous sensor data frames, determines whether the distance between the obstacle and the trajectory formed by those frames is smaller than a predetermined distance; and
a matching determination unit that confirms that the obstacle matches the trajectory formed by the predetermined number of sensor data frames when the distance determination unit determines that the distance is smaller than the predetermined distance.
9. An electronic device, comprising:
a processor;
a memory for storing instructions executable by the processor;
wherein the processor is configured to execute the instructions to implement the method of any one of claims 1 to 7.
10. A computer-readable storage medium, on which a device control program is stored, which, when executed by a processor, implements the method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210127985.9A CN114170274B (en) | 2022-02-11 | 2022-02-11 | Target tracking method and device, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114170274A true CN114170274A (en) | 2022-03-11 |
CN114170274B CN114170274B (en) | 2022-06-14 |
Family
ID=80489769
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210127985.9A Active CN114170274B (en) | 2022-02-11 | 2022-02-11 | Target tracking method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114170274B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114463388A (en) * | 2022-04-13 | 2022-05-10 | 北京中科慧眼科技有限公司 | Binocular camera-based height limiting device detection method and system and intelligent terminal |
CN114779271A (en) * | 2022-06-16 | 2022-07-22 | 杭州宏景智驾科技有限公司 | Target detection method and device, electronic equipment and storage medium |
CN115214719A (en) * | 2022-05-30 | 2022-10-21 | 广州汽车集团股份有限公司 | Obstacle trajectory tracking method and device, intelligent driving equipment and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160063330A1 (en) * | 2014-09-03 | 2016-03-03 | Sharp Laboratories Of America, Inc. | Methods and Systems for Vision-Based Motion Estimation |
CN111932580A (en) * | 2020-07-03 | 2020-11-13 | 江苏大学 | Road 3D vehicle tracking method and system based on Kalman filtering and Hungary algorithm |
CN112285714A (en) * | 2020-09-08 | 2021-01-29 | 苏州挚途科技有限公司 | Obstacle speed fusion method and device based on multiple sensors |
CN113537287A (en) * | 2021-06-11 | 2021-10-22 | 北京汽车研究总院有限公司 | Multi-sensor information fusion method and device, storage medium and automatic driving system |
Non-Patent Citations (1)
Title |
---|
胡随芯 (Hu Suixin) et al.: "Research on a Multi-Object Tracking Algorithm for Highway Vehicles Based on Multi-Feature Fusion", 《汽车技术》 (Automobile Technology) *
Also Published As
Publication number | Publication date |
---|---|
CN114170274B (en) | 2022-06-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114170274B (en) | Target tracking method and device, electronic equipment and storage medium | |
JP7140922B2 (en) | Multi-sensor data fusion method and apparatus | |
CN109521756B (en) | Obstacle motion information generation method and apparatus for unmanned vehicle | |
JP7130505B2 (en) | Driving lane identification without road curvature data | |
JP6666075B2 (en) | Method and apparatus for determining a lane identifier of a road | |
CN1940591B (en) | System and method of target tracking using sensor fusion | |
JP5162849B2 (en) | Fixed point position recorder | |
CN110596694A (en) | Complex environment radar multi-target tracking and road running environment prediction method | |
US11520340B2 (en) | Traffic lane information management method, running control method, and traffic lane information management device | |
US20180095103A1 (en) | State calculation apparatus, state calculation method, and recording medium storing program for moving object | |
CN113492851B (en) | Vehicle control device, vehicle control method, and computer program for vehicle control | |
CN110632617B (en) | Laser radar point cloud data processing method and device | |
US10839263B2 (en) | System and method for evaluating a trained vehicle data set familiarity of a driver assitance system | |
CN107103275B (en) | Wheel-based vehicle detection and tracking using radar and vision | |
US11703335B2 (en) | Coordinating and learning maps dynamically | |
CN113228040A (en) | Multi-level object heading estimation | |
CN113848545B (en) | Fusion target detection and tracking method based on vision and millimeter wave radar | |
CN110307841B (en) | Vehicle motion parameter estimation method based on incomplete information measurement | |
CN113665570A (en) | Method and device for automatically sensing driving signal and vehicle | |
Lee et al. | A geometric model based 2D LiDAR/radar sensor fusion for tracking surrounding vehicles | |
CN113933858A (en) | Abnormal detection method and device of positioning sensor and terminal equipment | |
US11420632B2 (en) | Output device, control method, program and storage medium | |
CN115856872A (en) | Vehicle motion track continuous tracking method | |
JP6920342B2 (en) | Devices and methods for determining object kinematics for movable objects | |
CN114049767B (en) | Edge computing method and device and readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||