CN111260683A - Target detection and tracking method and device for three-dimensional point cloud data - Google Patents
- Publication number
- CN111260683A (application CN202010020376.4A)
- Authority
- CN
- China
- Prior art keywords
- target
- point cloud
- point
- data
- ground
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/215—Motion-based segmentation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/66—Tracking systems using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4802—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4817—Constructional features, e.g. arrangements of optical elements relating to scanning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
- G06T2207/10044—Radar image
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
The invention discloses a target detection and tracking method for three-dimensional point cloud data, which comprises the following steps: reading original point cloud data from a laser radar device, and performing ground plane point segmentation and extraction on the obtained data to obtain ground points P_g and non-ground points P_ng; clustering the non-ground points to generate independent target point cloud clusters; computing, within each point cloud cluster, the target data generated by clustering; discriminating the generated target data to obtain moving targets and their velocity information; obtaining a prior estimate through motion estimation based on a uniform linear motion model; and performing a state update on the prior estimate and the actual observation of the target to obtain state-updated target data for the current frame in real time, i.e. three-dimensional point cloud target matching. The target tracking algorithm adopted by the invention combines motion estimation with feature matching based on target information, and improves the accuracy of target tracking by comparing the prior estimate with the observed value of the current frame.
Description
Technical Field
The invention relates to the technical field of laser radar detection, in particular to a target detection and tracking method and device of three-dimensional point cloud data.
Background
In recent years, the automobile industry in China has maintained a steady development trend. Unmanned-driving technology represents the latest direction of the whole industry; it aims to comprehensively improve the safety, comfort and stability of automobile driving and to meet higher market demands.
With the continuous development of the economy and society and the accelerating pace of urbanization, traffic conditions on urban roads are increasingly complex, and real-time detection of pedestrians and vehicles is of great significance in the field of unmanned driving. For example, the running states of pedestrians and vehicles on both sides of the road can be detected, so that the control decision system can reasonably plan avoidance and overtaking manoeuvres. Real-time, rapid target detection and tracking is therefore very important in unmanned-driving technology, which makes the research of the invention significant.
Target detection and tracking methods are largely classified, according to the data source, into image-based methods and radar-data-based methods. Target detection and tracking based on image sequences, however, suffers from the large data volume of image processing, poor algorithmic real-time performance, and difficulty in obtaining depth information of the target.
In order to solve the above problems, in recent years, laser radar has been widely used in the field of target detection and tracking due to its advantages of small data volume, high ranging accuracy, good real-time performance, and the like. Therefore, a target detection and tracking method of three-dimensional point cloud data is provided.
Disclosure of Invention
The invention aims to provide a target detection and tracking method and a target detection and tracking device for three-dimensional point cloud data, so as to solve the problems in the background technology.
In order to achieve the purpose, the invention provides the following technical scheme:
a target detection and tracking method of three-dimensional point cloud data comprises the following steps:
Step 1, reading original point cloud data from a laser radar device, and performing ground plane point segmentation and extraction on the obtained original point cloud data to obtain ground points P_g and non-ground points P_ng;
Step 2, clustering non-ground points to generate independent target point cloud clusters;
step 3, obtaining target data generated by clustering through calculation in the point cloud cluster;
step 4, judging the generated target data to obtain a moving target and speed information of the target;
step 5, obtaining prior estimation based on motion estimation of the uniform linear motion model;
step 6, performing a state update on the prior estimate and the actual observation of the target to obtain state-updated target data for the current frame in real time, i.e. the three-dimensional point cloud target matching.
Preferably, the segmentation and extraction of the ground plane points adopts a multi-plane fitting method with a fixed number of iterations, comprising the following steps:
step 1.1, for the estimation of the plane, a linear model ax + by + cz + d = 0 is adopted, and the plane is fitted using the least squares method;
step 1.2, for the initial point cloud data, selecting the points with the lowest height h as the initial ground points P_g;
step 1.3, obtaining the linear model of the ground plane: ground = EstimatePlane(P_g);
step 1.4, traversing the input point cloud P, and for each point p_k in P computing its distance to the ground plane, h_dist = |a·x_k + b·y_k + c·z_k + d| / √(a² + b² + c²);
step 1.5, setting the threshold distance Th_dist of the ground plane; if h_dist > Th_dist, p_k is assigned to the non-ground points P_ng; otherwise p_k is assigned to the ground points P_g;
step 1.6, setting the number of iterations N_iter and repeating steps 1.3 to 1.5 until N_iter iterations have been performed;
the method thus classifies the input point cloud P into ground points P_g and non-ground points P_ng.
Preferably, the clustering processing of the non-ground points comprises the following steps:
step 2.1, traversing the point cloud P_ng in a counter-clockwise manner starting from the top scanline, forming the runs of that scanline, each run being assigned its own new label;
step 2.2, for each point in a scanline, if the distance d to the preceding point is greater than the merging threshold Th_merge, the point is assigned a new label and starts a new run;
step 2.3, the runs of the first scanline then become runsAbove, used to propagate their labels to the runs of the subsequent scanline; when the distance d between a point of a new run and its nearest neighbour in the scanline above is smaller than the merging threshold Th_merge, the label of runsAbove is propagated to the new run; when points in the same run have nearest neighbours carrying several different inheritable labels, the run receives the smallest of those label values;
step 2.4, when a point in a run cannot find a suitable nearest neighbour that already carries a label, its run becomes a new runsAbove;
step 2.5, a second pass of runsAbove propagation is executed to finally update the point labels.
Preferably, the target data generated by clustering is obtained as follows:
the points sharing the same label form an acquired target point cloud cluster; with the laser radar's own position as the origin, the same three-dimensional rectangular coordinate system is established for every frame of point cloud; for each point P_i in a point cloud cluster, its center point P_center = average(P_i) is calculated to obtain the center position [X_center, Y_center, Z_center]; using an envelope method, the extreme values of the distances between points, [ΔX_MAX, ΔY_MAX, ΔZ_MAX], are obtained, yielding a cuboid that contains all points in the point cloud cluster and giving the size information of the target: length, width and height [l, m, n].
Preferably, the moving target discrimination comprises the following steps:
step 4.1, assigning a unique ID value to each target in the target data generated by clustering in the current frame;
step 4.2, comparing the target data of two consecutive frames: dist(C_i^(k-1), C_j^k) < Threshold_d, where dist(C_i^(k-1), C_j^k) denotes the distance between the center of the ith target of frame k-1 and the center of the jth target of frame k, and Threshold_d is the threshold on the change of the center position; if two target data between adjacent frames satisfy the formula, they are determined to be the same moving target; if the ID values of the two targets differ, the target ID value of the previous frame is assigned to the target of the following frame;
step 4.3, if a target with the same ID has more than 3 frames of target data, it can be determined to be a moving target; the scanning frequency of the laser radar is 10 Hz, so the time interval between point cloud data of adjacent frames is 0.1 s; the velocity information v_x and v_y of the target is then obtained from the position information of the target data.
Preferably, the motion estimation based on the uniform linear motion model comprises the following steps:
step 5.1, establishing the state vector X_k = [x, y, l, m, n, v_x, v_y]^T, where x and y denote the x and y coordinates of the center point of the moving target, v_x and v_y denote the velocity of the target in the x and y directions, and l, m and n denote the length, width and height of the cuboid circumscribing the moving target; because the target does not move in the z direction in the experimental scene, z and v_z are eliminated from the state vector X_k;
step 5.2, the state prediction equation is X̂_k = A·X_(k-1), where A is the state transition matrix of the uniform linear motion model; the state prediction part uses the current state X_(k-1) to estimate the state X̂_k at the next moment, obtaining the prior estimate.
Preferably, the three-dimensional point cloud target matching comprises the following steps:
step 6.1, comparing the target data of two consecutive frames: dist(X̂_i^k, Z_j^k) < Threshold_d and ΔV_(i,j) < Threshold_v, where dist(X̂_i^k, Z_j^k) denotes the distance between the center of the prior estimate of the ith target and the center of the actual observation of the jth target in the kth frame of point cloud; ΔV_(i,j) is the absolute value of the difference between the two target volumes and reflects the similarity of the target sizes; Threshold_d and Threshold_v are the thresholds for the change of the center position and the change of the volume, respectively;
step 6.2, if the matching conditions are met, the prior estimate and the actual observation of the target are fused according to the state update equation X_k = X̂_k + K_k(Z_k − H·X̂_k), with observation vector Z_k = [x, y, l, m, n, v_x, v_y]^T, to obtain the updated state; the moving target data of the current frame is thus obtained in real time.
In order to achieve the purpose, the invention also provides a target detection and tracking device of the three-dimensional point cloud data, which can quickly and accurately realize the detection and classification of the target.
An apparatus for detecting and tracking a target of three-dimensional point cloud data, comprising:
a scanning unit for reading the original point cloud data from the laser radar device,
the filtering unit is used for filtering clutter signals in the original point cloud data;
the point cloud data processing unit processes the filtered original point cloud data according to the following steps:
step 1, performing ground plane point segmentation and extraction on the point cloud data to obtain ground points P_g and non-ground points P_ng;
Step 2, clustering non-ground points to generate independent target point cloud clusters;
step 3, obtaining target data generated by clustering through calculation in the point cloud cluster;
the target matching unit processes the target data generated by the acquired clusters according to the following steps:
step 4, judging the generated target data to obtain a moving target and speed information of the target;
step 5, obtaining prior estimation based on motion estimation of the uniform linear motion model;
and a pattern recognition unit, which performs a state update on the prior estimate and the actual observation of the target to obtain state-updated target data for the current frame in real time, i.e. the three-dimensional point cloud target matching.
Preferably, the filtering unit is based on a kalman filter.
Compared with the prior art, the invention has the beneficial effects that:
1. The invention adopts the assumption that the points of the point cloud with the lowest height values most probably belong to the ground; this prior knowledge provides a basis for the initial ground points of the fitted plane, and gives faster convergence than typical plane fitting techniques such as RANSAC, which selects initial ground points randomly. Ground points are updated by checking whether the projection height of each point onto the ground lies within a threshold distance, and repeated iterative fitting of the ground improves the adaptability and accuracy of the ground model.
2. The method traverses non-ground points by scanline and separates point clusters through a Euclidean distance threshold, progressively fusing clusters between points and between scanlines; using the point vectors (runs) generated from the scanlines as point cloud clusters accelerates the search during clustering compared with the original KD-tree index.
3. The method calculates the moving speed of a target from the target position information in two consecutive frames of the point cloud sequence while discriminating moving targets; the position, size and velocity of a target are taken as the state vector of a Kalman filter, and a motion model of uniform linear motion is established; because the inter-frame interval of the point cloud sequence is small, the motion state of the target is continuous and the accuracy of the motion estimation can be guaranteed.
4. The target tracking algorithm adopted by the invention combines motion estimation with feature matching based on target information, and improves the accuracy of target tracking by comparing the prior estimate with the observed value of the current frame.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
fig. 2 is a schematic block diagram of the apparatus of the present invention.
Fig. 3 is a diagram of the actual effect of the target detection and tracking algorithm of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1 to 3, the present invention provides a technical solution:
A target detection and tracking method of three-dimensional point cloud data: original point cloud data is read from the laser radar device, and ground plane fitting and segmentation are performed on it; non-ground points are traversed by scanline and clustered using a Euclidean distance threshold; the size of each generated target is first estimated, the scanlines inside the target are then retrieved and their curvature values computed, and these two features are compared with prior values to judge whether the differences lie within threshold ranges, performing the target classification work. Specifically, the method comprises the following steps:
a multi-plane fitting method for determining iteration times is provided, which is used for rapidly extracting ground points.
The problem that the density of original point cloud data obtained by scanning of a laser radar is overlarge is possibly existed, and downsampling processing can be performed in advance, so that the running time of an algorithm is reduced.
Step 1.1, for the estimation of the plane, we use a simple linear model: ax + by + cz is 0, a least square method is used for fitting a plane, and an implementation function of the method can be found in a PCL library;
step 1.2, selecting a point with h being 0 as an initial ground point P for the initial point cloud datagObtaining the ground plane linear model ground ═ EstimatePlane (P)g);
Step 1.3, traversing the input point cloud P, and regarding the point P in the input point cloud PkCalculating the distance from the ground plane
Step 1.4, setting up threshold distance Th of ground planedistJudgment of hdist>ThdistIf true, p iskEntry into non-ground point Png(ii) a Otherwise, p iskEntry into ground point Pg;
Step 1.5, establishing iteration times NiterThe loop calculation from step 1.2 to step 1.4 is performed until the number of iterations N is reachediter;
The method comprises P-dividing the input point cloudClass P being ground pointgAnd a non-ground point Png。
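The iterative ground-fitting loop of steps 1.1–1.5 can be sketched in NumPy as follows. This is a minimal illustration, not the patent's implementation: the seed count `n_seed`, the SVD-based least-squares fit, and the default thresholds are assumptions, and the patent itself points to the PCL library for the plane-fitting routine.

```python
import numpy as np

def estimate_plane(ground_pts):
    """Least-squares plane fit for the model a*x + b*y + c*z + d = 0.
    Returns the unit normal (a, b, c) and offset d."""
    centroid = ground_pts.mean(axis=0)
    # The plane normal is the right singular vector with the smallest
    # singular value of the centred point matrix.
    _, _, vh = np.linalg.svd(ground_pts - centroid)
    normal = vh[-1]
    d = -normal.dot(centroid)
    return normal, d

def segment_ground(points, n_iter=3, n_seed=200, th_dist=0.2):
    """Multi-plane fitting with a fixed number of iterations:
    seed with the lowest points (assumed to belong to the ground),
    then alternately fit the plane and re-classify all points."""
    # Initial ground points P_g: the n_seed points with the lowest height.
    order = np.argsort(points[:, 2])
    ground = points[order[:n_seed]]
    for _ in range(n_iter):
        normal, d = estimate_plane(ground)          # ground = EstimatePlane(P_g)
        h_dist = np.abs(points @ normal + d)        # point-to-plane distance
        mask = h_dist <= th_dist                    # compare with Th_dist
        ground = points[mask]
    return points[mask], points[~mask]              # P_g, P_ng
```

A usage example: scatter a noisy plane at z ≈ 0 plus a raised cluster, call `segment_ground`, and the raised cluster comes back in `P_ng` while the plane points come back in `P_g`.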
Step 2, clustering processing of non-ground points:
Clustering is required on the remaining points P_ng that do not belong to the ground, in order to generate independent target point cloud clusters. The goal is to acquire, for each point p_k in P_ng, a label representing the identity of its cluster. A layer of points generated from the same LiDAR ring is referred to as a scanline. Within each scanline, consecutive points form a point vector (vector) called a run. Points in a run share the same label.
Non-ground points P_ng are traversed by scanline, and runs are generated from the scanlines. vector is a container class from the C++ standard template library: in essence, a dynamic array that can hold objects of any type and can grow and shrink as data is added or removed.
Step 2.1, Point cloud PngIn a counter-clockwise manner starting from the top scanline, forming runs of scanlines and each scanline being assigned its own new label;
step 2.2, for each point in the scan line, if the distance d from the rear contact is greater than the merging threshold ThmergeIf so, the rear contact allocates a new label to form a new run;
step 2.3, then, the runs of the first scan line becomes runsAvove for propagating its label to the runs in the subsequent scan line. When the distance d between the new run point and the nearest neighbor point in the scanning line is less than the merging threshold ThmergeAt this time, the tag label of runsbove is propagated to a new runs. When a point in the same run has the nearest neighbors of a plurality of different inheritable labels, the label obtained by the point is the smallest in value;
step 2.4, when the point in runs can not find the proper adjacent point which contains the label recently, the point becomes a new runsAvove;
and 2.5, executing the runsAvove transfer for the second time to finally update the point label.
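A simplified sketch of the scan-line run clustering of steps 2.1–2.5, assuming the scanlines are already ordered top-first and each given as an ordered array of points. The union-find bookkeeping and all names are illustrative assumptions; a faithful implementation would follow the cited two-pass label propagation more closely.

```python
import numpy as np

def cluster_scanlines(scanlines, th_merge=0.5):
    """scanlines: list of (N_i, 3) arrays, top scanline first.
    Returns one integer label array per scanline; equal labels mean
    the points belong to the same cluster."""
    next_label = 0
    parent = {}                      # union-find over run labels

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[max(ra, rb)] = min(ra, rb)   # keep the smallest label

    labels, prev_pts, prev_lab = [], None, None
    for pts in scanlines:
        lab = np.empty(len(pts), dtype=int)
        for i, p in enumerate(pts):
            # Step 2.2: a large gap to the preceding point starts a new run.
            if i == 0 or np.linalg.norm(p - pts[i - 1]) > th_merge:
                parent[next_label] = next_label
                lab[i] = next_label
                next_label += 1
            else:
                lab[i] = lab[i - 1]
            # Step 2.3: inherit a label from the nearest point in runsAbove.
            if prev_pts is not None:
                d = np.linalg.norm(prev_pts - p, axis=1)
                j = int(np.argmin(d))
                if d[j] < th_merge:
                    union(int(lab[i]), int(prev_lab[j]))
        labels.append(lab)
        prev_pts, prev_lab = pts, lab           # this line becomes runsAbove
    # Step 2.5: second pass, resolve every label to its cluster root.
    return [np.array([find(int(l)) for l in lab]) for lab in labels]
```

With two scanlines each containing two well-separated groups of points, the function returns two cluster labels that are consistent across the scanlines.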
Step 3, acquiring target data generated by clustering:
The points with the same label form an acquired target point cloud cluster. For each point P_i in the point cloud cluster, its center point P_center = average(P_i) is calculated to obtain the center position [X_center, Y_center, Z_center]; traversing the points p_k in the point cloud cluster yields the extreme values of the distances between points, [ΔX_MAX, ΔY_MAX, ΔZ_MAX].
From these distance extremes a circumscribed cuboid can be generated for the clustered target, approximating its volume. A threshold range is set to eliminate targets whose volume is too large or too small (large wall surfaces or point cloud noise), reducing the running time of the target classification algorithm.
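The cluster statistics above reduce to a mean and an axis-aligned bounding box; the sketch below also shows the volume gating used to discard walls and noise. The volume limits `v_min`/`v_max` are illustrative assumptions, not values from the patent.

```python
import numpy as np

def cluster_features(cluster):
    """Centre and circumscribed axis-aligned cuboid of one cluster."""
    center = cluster.mean(axis=0)                       # P_center = average(P_i)
    extent = cluster.max(axis=0) - cluster.min(axis=0)  # [dX_MAX, dY_MAX, dZ_MAX] = [l, m, n]
    return center, extent

def is_valid_target(extent, v_min=0.05, v_max=30.0):
    """Reject clusters whose cuboid volume is implausibly small
    (point cloud noise) or large (e.g. a wall surface)."""
    v = float(np.prod(extent))
    return v_min <= v <= v_max
```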
Step 4, distinguishing a moving target:
and judging the generated target data to obtain the moving target and the speed information of the target.
And 4.1, endowing each target with a unique ID value for the target data generated by clustering in the current frame.
Step 4.2, comparing the target data of two consecutive frames: dist(C_i^(k-1), C_j^k) < Threshold_d, where dist(C_i^(k-1), C_j^k) denotes the distance between the center of the ith target of frame k-1 and the center of the jth target of frame k, and Threshold_d is the threshold on the change of the center position. If two target data between adjacent frames satisfy the formula, they are determined to be the same moving target; if the ID values of the two targets differ, the target ID value of the previous frame is assigned to the target of the following frame.
Step 4.3, if a target with the same ID has more than 3 frames of target data, it can be determined to be a moving target. The scanning frequency of the laser radar is 10 Hz, so the time interval between point cloud data of adjacent frames is 0.1 s; the velocity information v_x and v_y of the target is then obtained from the position information of the target data.
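Steps 4.1–4.3 — ID carry-over by centre distance, then velocity from the centre displacement over the 0.1 s frame interval — can be sketched as follows. The dictionary-based target representation and the `threshold_d` default are assumptions for illustration.

```python
import numpy as np

def match_by_center(prev_targets, curr_targets, threshold_d=1.0):
    """Step 4.2: if the centre distance between a frame-(k-1) target and a
    frame-k target is below Threshold_d, treat them as the same moving
    target and carry the previous frame's ID forward."""
    for curr in curr_targets:
        for prev in prev_targets:
            d = np.linalg.norm(curr["center"][:2] - prev["center"][:2])
            if d < threshold_d:
                curr["id"] = prev["id"]
                break
    return curr_targets

def estimate_velocity(prev, curr, dt=0.1):
    """Step 4.3: velocity from centre displacement; dt = 0.1 s at a
    10 Hz scan rate."""
    vx, vy = (curr["center"][:2] - prev["center"][:2]) / dt
    return vx, vy
```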
And 5, motion estimation based on a uniform linear motion model:
Step 5.1, establishing the state vector X_k = [x, y, l, m, n, v_x, v_y]^T, where x and y denote the x and y coordinates of the center point of the moving target, v_x and v_y denote the velocity of the target in the x and y directions, and l, m and n denote the length, width and height of the cuboid circumscribing the moving target. The target does not move in the z direction in the experimental scene, so z and v_z are eliminated from the state vector X_k.
Step 5.2, the state prediction equation is X̂_k = A·X_(k-1), where A is the state transition matrix of the uniform linear motion model; the state prediction part uses the current state X_(k-1) to estimate the state X̂_k at the next moment, obtaining the prior estimate.
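The constant-velocity prediction of step 5.2 amounts to multiplying by a fixed transition matrix A. A minimal sketch under the document's state layout X = [x, y, l, m, n, v_x, v_y]^T, assuming the 0.1 s frame interval:

```python
import numpy as np

DT = 0.1  # 10 Hz lidar, 0.1 s between frames

# Uniform-linear-motion transition matrix: only the centre coordinates
# change, by velocity * dt; size and velocity are carried over unchanged.
A = np.eye(7)
A[0, 5] = DT   # x <- x + vx * dt
A[1, 6] = DT   # y <- y + vy * dt

def predict(x_prev):
    """Prior estimate X̂_k = A @ X_{k-1}."""
    return A @ x_prev
```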
step 6, three-dimensional point cloud target matching:
the time interval between two adjacent frames of the point cloud sequence is small, the motion state of the target cannot be suddenly changed, and the motion is continuous. The same moving target can be considered, the prior estimation obtained by the state prediction calculation of the state at the previous moment is not changed much compared with the actual observation at the moment.
Step 6.1, comparing the target data of the two frames before and after,whereinRepresenting the distance between the prior estimation value of the ith target and the actual observation value center of the jth target in the kth frame point cloud;reflecting the similarity of the sizes of the two target volumes for the absolute value of the difference between the two target volumes; threshold (Threshold)dAnd ThresholdvRespectively, the central position change and the volume size change.
Step 6.2, if the matching conditions are met, updating the equation according to the prior estimation and the actual observation of the target and the state:and fusing to obtain an updated state. Observation vectorTarget data obtained in real time for the current frame.
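Steps 6.1–6.2 — gating on centre distance and volume difference, then a Kalman-style fusion of the prior estimate with the observation — might look like the sketch below. The identity observation matrix H and the noise covariance scale `r` are assumptions, since the patent does not give them.

```python
import numpy as np

H = np.eye(7)  # assumed observation matrix: every state component is measured

def matches(prior, obs, threshold_d=1.0, threshold_v=1.0):
    """Step 6.1: gate on centre distance and on the absolute difference
    of the circumscribed-cuboid volumes (l*m*n)."""
    d = np.linalg.norm(prior[:2] - obs[:2])
    dv = abs(np.prod(prior[2:5]) - np.prod(obs[2:5]))
    return d < threshold_d and dv < threshold_v

def update(x_prior, p_prior, z, r=0.01):
    """Step 6.2: standard Kalman update fusing the prior estimate with the
    observation Z_k = [x, y, l, m, n, vx, vy]^T."""
    R = r * np.eye(7)                          # assumed measurement noise
    S = H @ p_prior @ H.T + R
    K = p_prior @ H.T @ np.linalg.inv(S)       # Kalman gain K_k
    x_post = x_prior + K @ (z - H @ x_prior)   # fused (posterior) state
    p_post = (np.eye(7) - K @ H) @ p_prior
    return x_post, p_post
```

With a low measurement noise relative to the prior covariance, the fused state lands close to the observation, as expected.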
In this embodiment, a target detection and tracking apparatus for three-dimensional point cloud data can rapidly and accurately detect and classify a target, including:
a scanning unit for reading the original point cloud data from the laser radar device,
the filtering unit is used for filtering clutter signals in the original point cloud data;
the point cloud data processing unit processes the filtered original point cloud data according to the following steps:
step 1, performing ground plane point segmentation and extraction on the point cloud data to obtain ground points P_g and non-ground points P_ng;
Step 2, clustering non-ground points to generate independent target point cloud clusters;
step 3, obtaining target data generated by clustering through calculation in the point cloud cluster;
the target matching unit processes the target data generated by the acquired clusters according to the following steps:
step 4, judging the generated target data to obtain a moving target and speed information of the target;
step 5, obtaining prior estimation based on motion estimation of the uniform linear motion model;
and a pattern recognition unit, which performs a state update on the prior estimate and the actual observation of the target to obtain state-updated target data for the current frame in real time, i.e. the three-dimensional point cloud target matching.
The processing method has the advantages of small data volume, high processing speed and high target classification accuracy. The target can be detected and tracked quickly and accurately, and therefore the unmanned system is well helped to make control decisions.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.
Claims (9)
1. A target detection and tracking method of three-dimensional point cloud data is characterized by comprising the following steps:
step 1, reading original point cloud data from the laser radar device, and performing ground plane point segmentation and extraction on the obtained original point cloud data to obtain ground points P_g and non-ground points P_ng;
Step 2, clustering non-ground points to generate independent target point cloud clusters;
step 3, obtaining target data generated by clustering through calculation in the point cloud cluster;
step 4, judging the generated target data to obtain a moving target and speed information of the target;
step 5, obtaining prior estimation based on motion estimation of the uniform linear motion model;
step 6, performing a state update on the prior estimate and the actual observation of the target to obtain state-updated target data for the current frame in real time, i.e. the three-dimensional point cloud target matching.
2. The method for detecting and tracking the target of the three-dimensional point cloud data according to claim 1, wherein the segmentation and extraction of the ground plane points adopts a multi-plane fitting method with a fixed number of iterations, comprising the following steps:
step 1.1, for the estimation of the plane, a linear model ax + by + cz + d = 0 is adopted, and the plane is fitted using the least squares method;
step 1.2, for the initial point cloud data, selecting the points with the lowest height h as the initial ground points P_g;
step 1.3, obtaining the linear model of the ground plane: ground = EstimatePlane(P_g);
step 1.4, traversing the input point cloud P, and for each point p_k in P computing its distance to the ground plane, h_dist = |a·x_k + b·y_k + c·z_k + d| / √(a² + b² + c²);
step 1.5, setting the threshold distance Th_dist of the ground plane; if h_dist > Th_dist, p_k is assigned to the non-ground points P_ng; otherwise p_k is assigned to the ground points P_g;
step 1.6, setting the number of iterations N_iter and repeating steps 1.3 to 1.5 until N_iter iterations have been performed;
the method thus classifies the input point cloud P into ground points P_g and non-ground points P_ng.
3. The method of claim 2, wherein the non-ground point clustering comprises the following steps:
step 2.1, traversing the point cloud P_ng in a counter-clockwise manner starting from the top scan line, forming runs within each scan line, each run being assigned its own new label;
step 2.2, for each point in the scan line, if the distance d to the preceding point is greater than the merging threshold Th_merge, the point is assigned a new label, forming a new run;
step 2.3, the runs of the first scan line then become runsAbove, used to propagate their labels to the runs of the subsequent scan line; when the distance d between a point of a new run and its nearest neighbor in the scan line above is smaller than the merging threshold Th_merge, the label of runsAbove is propagated to the new run; when points in the same run have nearest neighbors carrying several different inheritable labels, the run receives the smallest of those labels;
step 2.4, when a point in the runs cannot find a suitable labeled neighbor, it becomes a new runsAbove;
step 2.5, performing the runsAbove propagation a second time to finally update the point labels.
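A simplified sketch of the scan-line run clustering in steps 2.1 to 2.5; for brevity it replaces the explicit two-pass runsAbove propagation with a union-find over run labels, which likewise resolves conflicting inheritable labels to the smallest one. The thresholds `th_run` and `th_merge` and the input layout are assumptions:

```python
import numpy as np

def cluster_runs(scan_lines, th_run=0.5, th_merge=1.0):
    """
    Simplified scan-line-run clustering (sketch).
    scan_lines: list of (N_i, 3) arrays, one per lidar scan line, points in scan order.
    Returns one label array per scan line.
    """
    parent = {}                                     # union-find over run labels

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]           # path compression
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[max(ra, rb)] = min(ra, rb)       # smallest label wins

    next_label = 0
    labels, prev_pts, prev_lab = [], None, None
    for pts in scan_lines:
        lab = np.empty(len(pts), dtype=int)
        for i, p in enumerate(pts):
            # a gap larger than th_run to the preceding point starts a new run
            if i == 0 or np.linalg.norm(p - pts[i - 1]) > th_run:
                parent[next_label] = next_label
                lab[i] = next_label
                next_label += 1
            else:
                lab[i] = lab[i - 1]
            # merge with the nearest neighbor in the scan line above (runsAbove)
            if prev_pts is not None:
                d = np.linalg.norm(prev_pts - p, axis=1)
                j = d.argmin()
                if d[j] < th_merge:
                    union(lab[i], prev_lab[j])
        labels.append(lab)
        prev_pts, prev_lab = pts, lab
    return [np.array([find(l) for l in lab]) for lab in labels]
```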
4. The method for detecting and tracking the target of the three-dimensional point cloud data according to claim 3, wherein the step of obtaining the target data generated by clustering comprises the following steps:
the points bearing the same label form an acquired target point cloud cluster; taking the position of the laser radar itself as the origin, the same three-dimensional rectangular coordinate system is established for each frame of point cloud; for the points in each point cloud cluster, the center point P_center = average(P_i) is calculated, giving the center position [X_center, Y_center, Z_center]; the extreme coordinate differences between points [ΔX_MAX, ΔY_MAX, ΔZ_MAX] are obtained by an envelope method, yielding a cuboid that contains all points of the point cloud cluster, from which the size information of the target, namely the length, width and height [l, m, n], is obtained.
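The center and envelope computation of claim 4 admits a direct sketch; the axis-aligned interpretation of the envelope cuboid is an assumption:

```python
import numpy as np

def target_from_cluster(cluster):
    """Center and envelope (axis-aligned bounding cuboid) of one (N, 3) point cloud cluster."""
    center = cluster.mean(axis=0)                        # P_center = average(P_i)
    extents = cluster.max(axis=0) - cluster.min(axis=0)  # [dX_max, dY_max, dZ_max]
    l, m, n = extents                                    # length, width, height
    return center, (l, m, n)
```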
5. The method for detecting and tracking the target of the three-dimensional point cloud data according to claim 1, wherein the moving target is determined by the following steps:
step 4.1, assigning a unique ID value to each target for target data generated by clustering in the current frame;
step 4.2, comparing the target data of two consecutive frames: d_ij^k < Threshold_d, wherein d_ij^k represents the distance between the center of the i-th target of frame k-1 and the center of the j-th target of frame k, and Threshold_d is the threshold on the change of the center position; if two target data of adjacent frames satisfy the formula, they are determined to be the same moving target; if the ID values of the two targets differ, the target ID value of the previous frame is assigned to the target of the following frame;
step 4.3, if a target with the same ID possesses more than 3 frames of target data, it can be determined to be a moving target; the scanning frequency of the laser radar is 10 Hz, so the time interval between point cloud data of adjacent frames is 0.1 s; the velocity information v_x and v_y of the target is obtained from the position information of the target data.
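Steps 4.1 to 4.3 could be sketched as follows; the greedy one-to-one matching and the value of `THRESHOLD_D` are assumptions, while the 0.1 s frame interval follows from the 10 Hz scanning frequency stated in the claim:

```python
import numpy as np

DT = 0.1           # 10 Hz lidar -> 0.1 s between frames
THRESHOLD_D = 1.0  # assumed threshold on center-position change (metres)

def associate_and_speed(prev_targets, curr_targets):
    """
    prev_targets / curr_targets: dicts {id: center (3,)} for two consecutive frames.
    Matches targets whose center moved less than THRESHOLD_D, carries the old ID
    forward, and derives v_x, v_y from the center displacement.
    """
    matched = {}
    for pid, pc in prev_targets.items():
        for cid, cc in list(curr_targets.items()):
            if np.linalg.norm(cc - pc) < THRESHOLD_D:
                vx, vy = (cc[:2] - pc[:2]) / DT
                matched[pid] = {"center": cc, "vx": vx, "vy": vy}
                del curr_targets[cid]          # greedy one-to-one matching
                break
    return matched
```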
6. The method for detecting and tracking the target of the three-dimensional point cloud data according to claim 1, wherein the motion estimation based on the uniform linear motion model comprises the following steps:
step 5.1, establishing a state vector X_k = [x, y, l, m, n, v_x, v_y]^T, wherein x and y denote the x and y coordinates of the center point of the moving target, v_x and v_y denote the speed of the target in the x and y directions, and l, m and n denote the length, width and height of the circumscribed cuboid of the moving target; the target does not move in the z direction in the experimental scene, so z and v_z are eliminated from the state vector X_k.
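Under the state vector of step 5.1, the uniform linear motion model gives a constant-velocity prior; the sketch below assumes the standard transition-matrix form (the claim text here does not reproduce the patent's own prediction equations):

```python
import numpy as np

DT = 0.1  # frame interval at 10 Hz

# State X = [x, y, l, m, n, vx, vy]^T; constant-velocity transition matrix.
F = np.eye(7)
F[0, 5] = DT   # x <- x + vx * DT
F[1, 6] = DT   # y <- y + vy * DT

def predict(x):
    """Prior estimate of the next state under the uniform linear motion model."""
    return F @ x
```

The sizes l, m, n and the velocities are carried over unchanged; only the center position is advanced.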
7. The method of claim 6, wherein the three-dimensional point cloud target matching comprises:
step 6.1, comparing the target data of two consecutive frames: d_ij^k < Threshold_d and ΔV_ij^k < Threshold_v, wherein d_ij^k represents the distance between the center of the prior estimate of the i-th target and the center of the actual observation of the j-th target in the k-th frame point cloud, and ΔV_ij^k, the absolute value of the difference between the two target volumes, reflects the similarity of the two target sizes; Threshold_d and Threshold_v are the thresholds on the change of the center position and the change of the volume size, respectively;
step 6.2, if the matching conditions are met, fusing the prior estimate and the actual observation of the target through the state update equation to obtain the updated state, with the observation vector Z_k = [x, y, l, m, n, v_x, v_y]^T, thereby obtaining the moving target data of the current frame in real time.
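The state update equation itself is not reproduced in this text; a standard Kalman-style fusion consistent with the 7-dimensional state and observation vectors of claims 6 and 7 might look like the sketch below, where the covariances `P_prior` and `R` and the identity observation matrix are assumptions:

```python
import numpy as np

def kalman_update(x_prior, P_prior, z, R):
    """
    Fuse the prior estimate with the matched observation Z_k = [x,y,l,m,n,vx,vy]^T.
    The observation covers the full state here, so H = I and drops out.
    """
    K = P_prior @ np.linalg.inv(P_prior + R)        # Kalman gain
    x_post = x_prior + K @ (z - x_prior)            # state update
    P_post = (np.eye(len(x_prior)) - K) @ P_prior   # covariance update
    return x_post, P_post
```

With equal prior and observation uncertainty the update averages the two, which matches the intuition of "fusing" the prediction with the matched detection.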
8. A target detection and tracking device of three-dimensional point cloud data is characterized by comprising:
a scanning unit for reading the original point cloud data from the laser radar device;
a filtering unit for filtering clutter signals in the original point cloud data;
a point cloud data processing unit which processes the filtered original point cloud data according to the following steps:
step 1, reading original point cloud data from the laser radar device, and performing ground plane point segmentation and extraction on the obtained original point cloud data to obtain ground points P_g and non-ground points P_ng;
step 2, clustering the non-ground points to generate independent target point cloud clusters;
step 3, obtaining the target data generated by clustering through calculation within the point cloud clusters;
a target matching unit which processes the acquired target data generated by clustering according to the following steps:
step 4, judging the generated target data to obtain the moving targets and the speed information of each target;
step 5, obtaining a prior estimate through motion estimation based on a uniform linear motion model;
and a pattern recognition unit which performs a state update on the prior estimate and the actual observation of the target to obtain the target data with updated state currently acquired in real time, namely the three-dimensional point cloud target matching.
9. The device of claim 8, wherein the filtering unit is based on a Kalman filter.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010020376.4A CN111260683B (en) | 2020-01-09 | 2020-01-09 | Target detection and tracking method and device for three-dimensional point cloud data |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111260683A true CN111260683A (en) | 2020-06-09 |
CN111260683B CN111260683B (en) | 2023-08-08 |
Family
ID=70948590
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010020376.4A Active CN111260683B (en) | 2020-01-09 | 2020-01-09 | Target detection and tracking method and device for three-dimensional point cloud data |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111260683B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103871100A (en) * | 2014-04-02 | 2014-06-18 | 中国科学院自动化研究所 | Method for rebuilding tree model based on point cloud and data driving |
US20180074203A1 (en) * | 2016-09-12 | 2018-03-15 | Delphi Technologies, Inc. | Lidar Object Detection System for Automated Vehicles |
CN110647835A (en) * | 2019-09-18 | 2020-01-03 | 合肥中科智驰科技有限公司 | Target detection and classification method and system based on 3D point cloud data |
Non-Patent Citations (1)
Title |
---|
Lu Debiao et al.: "Vehicle target detection and tracking method based on depth data", Journal of Transportation Systems Engineering and Information Technology * |
Cited By (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111766608A (en) * | 2020-06-12 | 2020-10-13 | 苏州泛像汽车技术有限公司 | Environmental perception system based on laser radar |
CN111781608A (en) * | 2020-07-03 | 2020-10-16 | 浙江光珀智能科技有限公司 | Moving target detection method and system based on FMCW laser radar |
CN111781608B (en) * | 2020-07-03 | 2023-04-25 | 浙江光珀智能科技有限公司 | Moving target detection method and system based on FMCW laser radar |
CN111929676A (en) * | 2020-07-30 | 2020-11-13 | 上海交通大学 | X-band radar target detection and tracking method based on density clustering |
CN111929676B (en) * | 2020-07-30 | 2022-07-08 | 上海交通大学 | X-band radar target detection and tracking method based on density clustering |
CN113795771A (en) * | 2020-09-25 | 2021-12-14 | 深圳市大疆创新科技有限公司 | Method for estimating object speed by using point cloud radar, point cloud radar and system |
WO2022061758A1 (en) * | 2020-09-25 | 2022-03-31 | 深圳市大疆创新科技有限公司 | Method for estimating speed of object using point cloud radar, point cloud radar, and system |
WO2022099530A1 (en) * | 2020-11-12 | 2022-05-19 | 深圳元戎启行科技有限公司 | Motion segmentation method and apparatus for point cloud data, computer device and storage medium |
CN112462381A (en) * | 2020-11-19 | 2021-03-09 | 浙江吉利控股集团有限公司 | Multi-laser radar fusion method based on vehicle-road cooperation |
CN112462381B (en) * | 2020-11-19 | 2024-06-04 | 浙江吉利控股集团有限公司 | Multi-laser radar fusion method based on vehicle-road cooperation |
CN112762824A (en) * | 2020-12-24 | 2021-05-07 | 中南大学 | Unmanned vehicle positioning method and system |
CN112762824B (en) * | 2020-12-24 | 2022-04-22 | 中南大学 | Unmanned vehicle positioning method and system |
CN112847343A (en) * | 2020-12-29 | 2021-05-28 | 深圳市普渡科技有限公司 | Dynamic target tracking and positioning method, device, equipment and storage medium |
WO2022142948A1 (en) * | 2020-12-29 | 2022-07-07 | 深圳市普渡科技有限公司 | Dynamic target tracking and positioning method and apparatus, and device and storage medium |
CN112669462A (en) * | 2021-01-18 | 2021-04-16 | 新拓三维技术(深圳)有限公司 | Model processing method and system suitable for scanning point cloud |
CN112949566A (en) * | 2021-03-25 | 2021-06-11 | 浙江华是科技股份有限公司 | Monitoring method, device and system and computer storage medium |
CN112926514A (en) * | 2021-03-26 | 2021-06-08 | 哈尔滨工业大学(威海) | Multi-target detection and tracking method, system, storage medium and application |
CN113343840B (en) * | 2021-06-02 | 2022-03-08 | 合肥泰瑞数创科技有限公司 | Object identification method and device based on three-dimensional point cloud |
CN113343840A (en) * | 2021-06-02 | 2021-09-03 | 合肥泰瑞数创科技有限公司 | Object identification method and device based on three-dimensional point cloud |
CN113281718B (en) * | 2021-06-30 | 2024-03-22 | 江苏大学 | 3D multi-target tracking system and method based on laser radar scene flow estimation |
CN113281718A (en) * | 2021-06-30 | 2021-08-20 | 江苏大学 | 3D multi-target tracking system and method based on laser radar scene flow estimation |
CN113673105A (en) * | 2021-08-20 | 2021-11-19 | 安徽江淮汽车集团股份有限公司 | Design method of true value comparison strategy |
CN113721253A (en) * | 2021-08-30 | 2021-11-30 | 杭州视光半导体科技有限公司 | Moving object speed detection method based on FMCW laser radar |
CN113721253B (en) * | 2021-08-30 | 2024-03-15 | 杭州视光半导体科技有限公司 | Moving object speed detection method based on FMCW laser radar |
CN113932791B (en) * | 2021-09-15 | 2023-11-07 | 江苏徐工工程机械研究院有限公司 | Method and system for collecting map of loading and unloading operation area of open-pit mining area |
CN113932791A (en) * | 2021-09-15 | 2022-01-14 | 江苏徐工工程机械研究院有限公司 | Method and system for collecting map of loading and unloading operation area of strip mine area |
CN114111568A (en) * | 2021-09-30 | 2022-03-01 | 深圳市速腾聚创科技有限公司 | Method and device for determining appearance size of dynamic target, medium and electronic equipment |
CN114419152A (en) * | 2022-01-14 | 2022-04-29 | 中国农业大学 | Target detection and tracking method and system based on multi-dimensional point cloud characteristics |
CN114419152B (en) * | 2022-01-14 | 2024-04-26 | 中国农业大学 | Target detection and tracking method and system based on multi-dimensional point cloud characteristics |
CN114442101B (en) * | 2022-01-28 | 2023-11-14 | 南京慧尔视智能科技有限公司 | Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar |
CN114442101A (en) * | 2022-01-28 | 2022-05-06 | 南京慧尔视智能科技有限公司 | Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar |
CN114818916B (en) * | 2022-04-25 | 2023-04-07 | 电子科技大学 | Road target classification method based on millimeter wave radar multi-frame point cloud sequence |
CN114818916A (en) * | 2022-04-25 | 2022-07-29 | 电子科技大学 | Road target classification method based on millimeter wave radar multi-frame point cloud sequence |
CN115937320B (en) * | 2023-02-21 | 2023-05-05 | 深圳市华亿明投资发展有限公司 | Visual positioning method for polishing mobile phone shell |
CN115937320A (en) * | 2023-02-21 | 2023-04-07 | 深圳市华亿明投资发展有限公司 | Visual positioning method for polishing mobile phone shell |
CN116071694A (en) * | 2023-03-07 | 2023-05-05 | 浙江华是科技股份有限公司 | Ship detection method, device and computer readable storage medium |
CN117250610A (en) * | 2023-11-08 | 2023-12-19 | 浙江华是科技股份有限公司 | Laser radar-based intruder early warning method and system |
CN117250610B (en) * | 2023-11-08 | 2024-02-02 | 浙江华是科技股份有限公司 | Laser radar-based intruder early warning method and system |
CN117872310A (en) * | 2024-03-08 | 2024-04-12 | 陕西欧卡电子智能科技有限公司 | Radar-based water surface target tracking method, device, equipment and medium |
CN117872310B (en) * | 2024-03-08 | 2024-06-21 | 陕西欧卡电子智能科技有限公司 | Radar-based water surface target tracking method, device, equipment and medium |
Also Published As
Publication number | Publication date |
---|---|
CN111260683B (en) | 2023-08-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111260683A (en) | Target detection and tracking method and device for three-dimensional point cloud data | |
CN111693972B (en) | Vehicle position and speed estimation method based on binocular sequence images | |
CN110647835B (en) | Target detection and classification method and system based on 3D point cloud data | |
CN113156421A (en) | Obstacle detection method based on information fusion of millimeter wave radar and camera | |
WO2022188663A1 (en) | Target detection method and apparatus | |
CN113506318B (en) | Three-dimensional target perception method under vehicle-mounted edge scene | |
US9213901B2 (en) | Robust and computationally efficient video-based object tracking in regularized motion environments | |
Liu et al. | A survey of vision-based vehicle detection and tracking techniques in ITS | |
CN112052802B (en) | Machine vision-based front vehicle behavior recognition method | |
CN113838089B (en) | Bubble track tracking method based on feature matching algorithm | |
CN104036523A (en) | Improved mean shift target tracking method based on surf features | |
CN107066968A (en) | The vehicle-mounted pedestrian detection method of convergence strategy based on target recognition and tracking | |
CN113848545B (en) | Fusion target detection and tracking method based on vision and millimeter wave radar | |
CN112666573B (en) | Detection method for retaining wall and barrier behind mine unloading area vehicle | |
CN115308732A (en) | Multi-target detection and tracking method integrating millimeter wave radar and depth vision | |
CN114358140A (en) | Rapid capturing method for sparse point cloud aircraft under low visibility | |
CN115861968A (en) | Dynamic obstacle removing method based on real-time point cloud data | |
CN116643291A (en) | SLAM method for removing dynamic targets by combining vision and laser radar | |
CN116385493A (en) | Multi-moving-object detection and track prediction method in field environment | |
Qing et al. | A novel particle filter implementation for a multiple-vehicle detection and tracking system using tail light segmentation | |
CN113689459B (en) | Real-time tracking and mapping method based on GMM and YOLO under dynamic environment | |
CN113221739A (en) | Monocular vision-based vehicle distance measuring method | |
CN116664851A (en) | Automatic driving data extraction method based on artificial intelligence | |
Qing et al. | Localization and tracking of same color vehicle under occlusion problem | |
CN113673383B (en) | Time-space domain obstacle detection method and system for complex road scene |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||