CN111337941A - Dynamic obstacle tracking method based on sparse laser radar data - Google Patents


Info

Publication number: CN111337941A (application CN202010192403.6A)
Granted publication: CN111337941B
Authority: CN (China)
Legal status: Granted; currently active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Prior art keywords: point, point cloud, points, track, angle
Other languages: Chinese (zh)
Inventors: 周增祥, 孙翔峻, 段仕鹏, 左家乐, 黎梦涛, 柴源, 王应富, 刘志刚
Original and current assignee: University of Science and Technology of China (USTC) (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Priority: CN202010192403.6A

Classifications

    • G01S17/66 Tracking systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/93 Lidar systems specially adapted for anti-collision purposes
    • G06F18/23 Pattern recognition: clustering techniques
    • G06F18/24 Pattern recognition: classification techniques

Abstract

The invention discloses a dynamic obstacle tracking method and system based on sparse lidar data, belonging to the technical field of dynamic obstacle tracking and comprising the following steps: S1: eliminating the static background; S2: dynamic point cloud clustering; S3: convex hull extraction; S4: feature extraction; S5: data association; S6: trajectory prediction. The method filters the static-obstacle lidar point cloud with a grid map and then applies the DBSCAN algorithm to the small amount of remaining static point cloud to filter extremely sparse points, strengthening the final filtering effect. By extracting the angle information of the point cloud's convex hull, the point cloud is fuzzily classified as an ellipse, a rectangle or a straight line, and the size of the fitted figure is then used to obtain the correct point cloud position point, ensuring the accuracy of data association. The method also combines the nearest-neighbor and multiple-hypothesis approaches into an improved multi-target association algorithm, which completes data association efficiently while maintaining accuracy.

Description

Dynamic obstacle tracking method based on sparse laser radar data
Technical Field
The invention relates to the technical field of dynamic obstacle tracking, and in particular to a dynamic obstacle tracking method and system based on sparse lidar data.
Background
Object detection and tracking based on visual sensors has matured, but visual sensors often cannot provide distance information and must be used under good illumination, which limits their use in robot navigation. Lidar is unaffected by illumination and provides distance data directly, so it can support robot navigation and tracking tasks. In recent years, dynamic obstacle detection and trajectory prediction with lidar have become important topics in the mobile robot field.
With the development of unmanned-driving technology, many methods for detecting and tracking dynamic obstacles have appeared. For vehicle detection and tracking, Konrad M et al. use a grid map to detect and track automobiles. Zan Tong proposed a new global cylindrical-coordinate histogram feature for vehicle identification in urban environments and used a dynamic vehicle detection and tracking algorithm based on a likelihood-field model to track vehicles. Anna Petrovskaya of Stanford University et al. extracted the geometric and dynamic characteristics of tracked vehicles, established corresponding feature models, and updated the tracked target with a Bayes filter, but occluded vehicles still could not be detected. Template matching against obstacle contour-shape features has also been used, but the number of template models is small and the adaptability of obstacle matching is poor. Another approach clusters single-frame lidar point cloud data, extracts the contour features of an obstacle's bounding rectangle, associates obstacle information between two consecutive frames with a multiple hypothesis tracking (MHT) algorithm, and continuously predicts and tracks the dynamic obstacle with a Kalman filter.
In the unmanned-driving field, pedestrian movement must be considered in addition to avoiding large obstacles such as vehicles, and pedestrian detection and tracking have also been studied in recent years. Traditional pedestrian detection mainly relies on images from sensors such as visible-light cameras and applies machine-vision methods. Premebida et al. proposed a 15-dimensional lidar-based feature for pedestrian detection in urban environments; examples of these features are the minimum reflection distance and the number of points in a cluster. Wan et al. proposed a pedestrian detection method based on the fusion of lidar and images, which detects pedestrians with fused features. Kocelanfeng et al. cluster non-ground lidar point clouds with the DBSCAN algorithm and proposed a distribution feature of the fast point feature histogram for training a support vector machine classifier to detect pedestrians.
Three-dimensional lidar is commonly used for detection and tracking in unmanned driving, but in the closed environment of a mine the large data volume it produces is a heavy burden for an omnidirectional mobile inspection robot with strict real-time requirements. A two-dimensional lidar collects the point cloud of a single horizontal plane, so the data volume is small and processing is faster; however, the small data volume makes the point cloud sparse, and the accuracy of results computed from sparse data deteriorates, so accuracy is the first concern when a two-dimensional radar is used as the sensor. In an underground mine, illumination prevents a visual sensor from being used for assistance, and a three-dimensional radar produces too much data to process quickly in the closed mine environment, so a two-dimensional radar is finally selected as the sensor to help the mobile robot complete detection and tracking tasks.
Disclosure of Invention
The technical problem to be solved by the invention is how to obtain a correct tracking result from sparse two-dimensional radar data while reducing data errors during processing. To this end, the invention provides a dynamic obstacle tracking method based on sparse lidar data.
The invention solves this technical problem through the following technical scheme, comprising the following steps:
s1: eliminating static background
Determining a travelable area, deleting the ground point cloud of the travelable area and the point cloud exceeding a specified height, and obtaining the residual point cloud as the obstacle point cloud;
s2: dynamic point cloud clustering
Clustering the obstacle point clouds in the step S1, and dividing the point clouds belonging to different dynamic obstacles into different clusters;
s3: convex hull extraction
After the clustering of step S2, extracting a convex hull of the clustered point cloud;
s4: feature extraction
Carrying out fuzzy classification by utilizing the angle information of the convex hull, and fitting the point clouds to obtain position information of the obstacle;
s5: data association
Associating the positions belonging to the same obstacle at different moments according to the position information of obstacles in different frames, fitting the positions into a track, associating the tracked track with the clustered point cloud detected by the current data frame, and updating the motion track of the dynamic obstacle;
s6: predicting trajectories
Predicting the position of the dynamic obstacle using a trajectory function: according to the trajectory function, the x value of the last track point is increased to obtain xp, and xp is substituted into the fitting formula to find yp; the coordinate (xp, yp) is the predicted track point. The fitting formula used during straight-line prediction is as follows:
R2 = ∑ (yi - (a0*xi + a1))^2, minimized over a0 and a1
wherein (xi, yi) are the track points of the fitted straight line, a0 is the slope of the fitted line, a1 is its intercept, and R2 is the sum of the squared distance differences between the track points and the fitted line;
the formula carried in when predicting a polynomial curve is as follows:
Figure BDA0002416385580000031
wherein (x)i,yi) Is a locus point of a fitting cubic polynomial, a0、a1、a2、a3To fit coefficients of a cubic polynomial, R2Summing the distance differences of the locus points and the fitted straight line;
the trajectory functions include a straight line prediction function and a polynomial curve prediction function, and the straight line prediction function is as follows:
xp=x[n]+(len)/(t*n)
wherein x[n] is the x coordinate of the current track point, len is the sum of the absolute x-coordinate increments of the existing track points, t is a tunable parameter, and n is the number of existing track points; straight-line prediction estimates the likely x change at the current moment from the average x change over the existing track;
the polynomial curve prediction function is as follows:
xp=x[n]+len/(t*n)*k
wherein x[n] is the x coordinate of the current track point, t is a tunable parameter, n is the number of existing track points, len is the sum of the Euclidean distances between consecutive existing track points, and k is the slope of the current track; polynomial curve prediction estimates the x increment at the current moment from the average x increment between the existing track points.
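The two prediction modes above can be sketched as follows. This is an illustrative reading of the claims, not the patent's implementation: NumPy's least-squares fitting stands in for the fitting formulas, and all function names and default values are assumed.

```python
import numpy as np

def predict_line(track, t=1.0):
    """Straight-line prediction: fit y = a0*x + a1, then step x forward by
    the average absolute x increment of the existing track, len / (t*n)."""
    track = np.asarray(track, dtype=float)
    n = len(track)
    a0, a1 = np.polyfit(track[:, 0], track[:, 1], 1)  # slope, intercept
    length = np.abs(np.diff(track[:, 0])).sum()       # sum of |x increments|
    xp = track[-1, 0] + length / (t * n)
    return xp, a0 * xp + a1                           # predicted (xp, yp)

def predict_poly(track, t=1.0):
    """Cubic-polynomial prediction: the x step is scaled by the slope k of
    the current track, and len is the sum of Euclidean distances between
    consecutive track points."""
    track = np.asarray(track, dtype=float)
    n = len(track)
    coeffs = np.polyfit(track[:, 0], track[:, 1], 3)  # a3, a2, a1, a0
    length = np.linalg.norm(np.diff(track, axis=0), axis=1).sum()
    k = np.polyval(np.polyder(coeffs), track[-1, 0])  # slope at the last point
    xp = track[-1, 0] + length / (t * n) * k
    return xp, np.polyval(coeffs, xp)
```

With four collinear track points [[0, 0], [1, 2], [2, 4], [3, 6]] and t = 1, predict_line steps x forward by 3/4 to xp = 3.75 and reads yp off the fitted line.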
Further, in step S1, the point cloud contains only coordinate information, in the form of two-dimensional coordinates (x, y).
Further, the specific processing procedure of step S1 includes the following steps:
s11: determining an acquisition plane
Setting the height of the acquisition plane to be 0.8-1.0 m according to the height of the main obstacle;
s12: generating a template
Constructing a grid map or converting an existing mine map into the grid map, wherein the grid map is divided into an obstacle layer, an expansion layer and a travelable layer;
s13: pretreatment of
Loading the grid map built in step S12, setting the thickness of the expansion layer to 0.05 m, receiving the raw lidar data, processing the point cloud, and deleting illegal values from the point cloud;
s14: filtering point cloud
The point coordinates are compared with the grid map to judge whether the point lies on a travelable layer; if it lies on the travelable layer the point is kept, otherwise it is deleted.
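Steps S11 to S14 can be sketched as follows. The grid encoding (cell values for the travelable, expansion and obstacle layers), the function name, and the map geometry are assumptions for illustration only; real grid maps (e.g. a ROS occupancy grid) use their own conventions.

```python
import numpy as np

# Assumed cell values for the three layers of the grid map.
FREE, INFLATED, OCCUPIED = 0, 50, 100

def filter_static(points, grid, origin, resolution):
    """Keep only lidar points that fall on the travelable (FREE) layer.

    points: (n, 2) array of (x, y) points already in the map frame.
    grid: 2D occupancy array indexed [row, col]; origin: (x0, y0) of cell (0, 0);
    resolution: cell size in meters.
    """
    points = np.asarray(points, dtype=float)
    cols = ((points[:, 0] - origin[0]) / resolution).astype(int)
    rows = ((points[:, 1] - origin[1]) / resolution).astype(int)
    inside = (rows >= 0) & (rows < grid.shape[0]) & \
             (cols >= 0) & (cols < grid.shape[1])
    keep = np.zeros(len(points), dtype=bool)
    # Points on the expansion (inflation) layer are treated as static obstacles
    # too, so only FREE cells survive; points outside the map are dropped.
    keep[inside] = grid[rows[inside], cols[inside]] == FREE
    return points[keep]
```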
Further, in step S2, the points are divided into core points and boundary points during clustering, the core points forming the skeleton of a cluster. The clustering process involves two core parameters, Eps and MinPts. Eps defines the neighborhood between points, as shown below:
Ne={xi∈D|distance(xi,xj)<Eps}
wherein xi represents a point in the point cloud, distance(xi, xj) is the Euclidean distance between two points, Eps is the distance threshold below which the clustering algorithm classifies points into the same class, and D is the point cloud data set;
When the distance between two points is smaller than Eps, the two points are judged to be in the same neighborhood. MinPts is the parameter for judging core points: if the number of points within distance Eps of a point is larger than MinPts, that point is judged to be a core point.
Further, the clustering process includes the steps of:
s21: counting the number of points in each point neighborhood, storing the number into pts attributes, judging whether each point pts attribute is larger than MinPts, if so, judging that the point is a core point, and storing the core point into a core point array;
s22: traversing the array of core points concatenates core points within the neighborhood of the current core point into one class,
s23: and traversing all the boundary points, and merging the boundary points in the neighborhood of the core point into the core point class.
Further, in step S4, the moving objects in the mine environment are pedestrians and mine cars; pedestrians give elliptical point clouds, while mine cars appear as rectangular or straight-line point clouds depending on the viewing angle. Based on these characteristics and the angle information of the convex hulls, the moving objects are classified as ellipses, rectangles or straight lines.
Further, in step S4, the included-angle information of the point cloud is obtained from its convex hull: after the group of convex hull points of the point cloud is obtained, the hull point closest to the lidar is selected as the vertex of the angle, and the angle information with this closest point as vertex is counted and used as the angle feature of the point cloud, i.e. the included-angle information.
Further, the specific processing process of the point cloud angular features comprises the following steps:
s41: taking the point closest to the laser radar in the obtained convex hull array as a vertex;
s42: taking a point at the lower left corner in the convex hull array as a starting point, starting clockwise as an end point of one side of the angle, starting anticlockwise, taking a convex hull point as an end point of the other side of the angle, calculating the angle, and counting the maximum value of the angle and summing;
s43: obtaining an angle classification standard through angle statistics after point cloud angle features of the dynamic barrier are extracted, and determining an angle type according to the angle classification standard;
s44: and judging the source of the point cloud data according to the angle type, fitting different types of graphic point clouds by using different fitting formulas according to different driving characteristics of different sources, and further calculating the set characteristics of the graphic point clouds.
Furthermore, an ellipse fitting formula is used to fit the ellipse and obtain its center, major axis, minor axis and deflection angle; the ellipse fitting formula is as follows:
f = ∑ (xi^2 + A*xi*yi + B*yi^2 + C*xi + D*yi + E)^2
wherein (xi, yi) are the points of the fitted ellipse and A, B, C, D, E are the coefficients of the non-standard equation of the fitted ellipse;
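The ellipse fitting formula above is linear in A through E once the x^2 term is moved to the right-hand side, so it can be solved as an ordinary least-squares problem. The center-recovery helper below is an assumption for illustration (it follows from the gradient of the conic vanishing at the center), not something stated in the patent.

```python
import numpy as np

def fit_ellipse(points):
    """Least-squares fit of the conic x^2 + A*x*y + B*y^2 + C*x + D*y + E = 0.

    Minimizes f = sum((xi^2 + A*xi*yi + B*yi^2 + C*xi + D*yi + E)^2)
    over A..E by solving M @ [A,B,C,D,E] = -x^2 in the least-squares sense.
    """
    points = np.asarray(points, dtype=float)
    x, y = points[:, 0], points[:, 1]
    M = np.column_stack([x * y, y * y, x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(M, -x * x, rcond=None)
    return coeffs  # A, B, C, D, E

def ellipse_center(A, B, C, D, E):
    """Center of the fitted conic: the point where its gradient vanishes,
    i.e. 2x + A*y + C = 0 and A*x + 2*B*y + D = 0 (illustrative helper)."""
    M = np.array([[2.0, A], [A, 2.0 * B]])
    return np.linalg.solve(M, [-C, -D])
```

For points sampled on the circle of radius 1 centered at (2, 3), the fit recovers A = 0, B = 1, C = -4, D = -6, E = 12, and the center solver returns (2, 3).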
and fitting a straight line by using a straight line fitting formula to obtain the length, the coordinates of the starting point, the coordinates of the ending point and the coordinates of the central point, wherein the straight line fitting formula is as follows:
Figure BDA0002416385580000051
wherein (x)i,yi) Is the locus point of the fitting straight line, a is the slope of the fitting straight line, b is the intercept of the fitting straight line, R2Summing the distance differences of the locus points and the fitted straight line;
obtaining coordinates of a center point, a length, a width and four corner points of a rectangle by using the existing rectangle fitting function; the point representing the position is then determined based on the difference between the pedestrian and the mine car, and the position of the vehicle is represented by a right-angle point formed by the front end of the car and the side near the radar when the position point is determined. Represents an ellipse of the pedestrian, and takes the center point of the ellipse as the position point of the pedestrian.
The invention also provides a dynamic obstacle tracking system based on sparse lidar data, which comprises:
the elimination module is used for determining a travelable area and deleting ground point clouds and point clouds exceeding a certain height in the travelable area;
the clustering module is used for clustering the obstacle point clouds and dividing the point clouds belonging to different dynamic obstacles into different clusters;
the convex hull extracting module is used for extracting a convex hull of the clustered point cloud after clustering;
the characteristic extraction module is used for carrying out fuzzy classification by utilizing the angle information of the convex hull and then fitting the point clouds to obtain the position information of the obstacle;
the data association module, used for associating the positions belonging to the same obstacle at different moments according to the position information of obstacles in different frames, fitting the positions into a track, associating the tracked track with the clustered point cloud detected by the current data frame, and updating the motion track of the dynamic obstacle;
the track prediction module is used for predicting the position of the dynamic obstacle by utilizing a track function;
the control module is used for controlling the other modules to execute instructions;
the elimination module, the clustering module, the convex hull extraction module, the feature extraction module, the data association module and the track prediction module are all electrically connected with the control module.
Compared with the prior art, the invention has the following advantages: the dynamic obstacle tracking method and system based on sparse lidar data filter the static-obstacle lidar point cloud with a grid map and then apply the DBSCAN algorithm to the small amount of remaining static point cloud to filter extremely sparse points, strengthening the final filtering effect; by extracting the angle information of the point cloud's convex hull, the point cloud is fuzzily classified as an ellipse, a rectangle or a straight line, and the size of the fitted figure is then used to obtain the correct point cloud position point, ensuring the accuracy of data association; the nearest-neighbor and multiple-hypothesis approaches are combined into an improved multi-target association algorithm, which completes data association efficiently while maintaining accuracy.
Drawings
FIG. 1 is a general flow chart of a method for dynamic obstacle tracking based on sparse lidar data in an embodiment of the present invention;
FIG. 2(a) is a grid map in an embodiment of the present invention; FIG. 2(b) is an original point cloud in an embodiment of the present invention; FIG. 2(c) is a diagram of the effect of the embodiment of the present invention after actual data acquisition and background removal;
FIG. 3 is a diagram illustrating the clustering effect of the point clouds according to the embodiment of the present invention;
FIG. 4(a) is a cloud of points with fuzzy classification as ellipses in an embodiment of the present invention; FIG. 4(b) is a cloud point diagram with fuzzy classification as straight lines in an embodiment of the present invention; FIG. 4(c) is a cloud point diagram with fuzzy classification as rectangles in an embodiment of the present invention;
FIG. 5 is a diagram illustrating the effect of convex hull extraction according to an embodiment of the present invention;
FIG. 6(a) is a rectangular boundary diagram formed by the vehicle head and the point cloud on the side close to the radar in the embodiment of the invention; FIG. 6(b) is a straight line boundary diagram formed by point clouds of a vehicle near a radar side in the embodiment of the invention; FIG. 6(c) is a rectangular boundary diagram of the vehicle tail and the point cloud on the side near the radar in the embodiment of the present invention;
FIG. 7 is a schematic diagram of a process of calculating a front end right-angle point of a vehicle according to an embodiment of the present invention;
FIG. 8 is a flow chart illustrating track update matching according to an embodiment of the present invention;
FIG. 9(a) is a plot of a straight line fit trajectory in an embodiment of the present invention; FIG. 9(b) is a plot of a polynomial fit trajectory in an embodiment of the present invention.
Detailed Description
The following examples are given for the detailed implementation and specific operation of the present invention, but the scope of the present invention is not limited to the following examples.
The embodiment provides a technical scheme: a dynamic obstacle tracking method based on sparse lidar data, as shown in fig. 1, includes the following steps:
the first step is as follows: eliminating static background
The data collected by the lidar contains both static and dynamic obstacles; for dynamic obstacle tracking, the static obstacle point cloud is noise and must be removed. The general background elimination process for a three-dimensional radar is: first determine the travelable area, then delete the ground point cloud and the point cloud exceeding a certain height within that area; the remaining point cloud is the obstacle point cloud. This process deletes the ground and over-height ambient point clouds, but part of the static point cloud is still retained. Because a two-dimensional radar produces little data, noise must be reduced as much as possible to achieve the desired effect.
The specific processing procedure adopted in this embodiment is as follows:
1) Determining the acquisition plane. A two-dimensional radar can only collect information on one horizontal plane, so the height of the main obstacles must be considered when choosing that plane. Consulting the relevant data, typical mine mobile equipment is 0.8-2.4 meters high and the leg length of a pedestrian is generally 0.8-1.1 meters, so the acquisition plane is set at 0.8-1.0 meters.
2) Generating a template. A grid map is constructed with a mapping function package, or an existing mine map is converted into a grid map; the grid map is divided into an obstacle layer, an expansion layer and a travelable layer.
3) Preprocessing. The built grid map is loaded. When the distance between a dynamic obstacle and a static obstacle is more than 0.1 m, the static point cloud is filtered accurately; when that distance is 0-0.1 m, part of the dynamic point cloud may be filtered out because of the expansion layer, so this mistaken deletion is reduced by shrinking the expansion layer. In this embodiment the thickness of the expansion layer is set to 0.05 m, and the raw lidar data is then received.
Since the coordinate frame of the data collected by the lidar is called "laser" and the static map frame is called "map", the lidar data is first transformed into the map frame. The point cloud data is then processed to remove illegal values such as inf and invalid values such as (0, 0).
4) Filtering the point cloud. Each point is compared with the grid map to judge whether it lies on the travelable layer; if so, the point is kept, otherwise it is deleted. Because lidar data is unstable, the point cloud of an obstacle drifts slightly in position, so points falling on the expansion layer are also treated as static obstacles.
The created grid map is shown in fig. 2(a) and the original point cloud in fig. 2(b); after actual data acquisition and background elimination, the effect is shown in fig. 2(c), where most of the static point cloud has been filtered out. Even though a small part of the static point cloud remains, it is sparsely distributed after processing and can be filtered out completely by the subsequent clustering algorithm. Eliminating the interference of the static point cloud further reduces the data volume, making subsequent processing faster, and without static point cloud interference the obstacle can be tracked more accurately.
The second step is as follows: dynamic point cloud clustering
The dynamic point cloud contains data from several dynamic obstacles, and each point carries only coordinate information. To track dynamic obstacles, the points belonging to the same obstacle must be grouped into one class, so that aggregate features of that point cloud can be extracted as the features of the dynamic obstacle. Two commonly used clustering algorithms, DBSCAN and k-means, were compared. k-means requires the number of clusters k to be specified, its result is strongly influenced by the initial cluster centers, it assigns every point to some class, and it is sensitive to outliers. DBSCAN can reject noise, requires a neighborhood distance threshold Eps and a sample count threshold MinPts, and determines the number of clusters automatically. Because the number of dynamic obstacles is unknown and points of the same obstacle are clearly closer together than points of different obstacles, the DBSCAN algorithm is selected for clustering in this embodiment.
DBSCAN is a density-based clustering algorithm that assumes classes are determined by how closely the samples are packed: samples of the same class lie close together, i.e. around any sample of a class there are other samples of that class within a short distance. The algorithm divides the points into core points and boundary points, with the core points forming the cluster skeleton, and has two core parameters, Eps and MinPts. Eps defines the neighborhood between points, as follows:
Ne={xi∈D|distance(xi,xj)<Eps} (1)
the distance between two points is less than Eps, then it is considered to be in the same neighborhood. In addition, MinPts is a parameter for determining the core point, xiRepresenting points in a point cloud, distance (x)i,yi) In the mathematical domain, ∈ neighborhood, the area within a given object radius ∈ is called the ∈ neighborhood of the object.
The clustering process is as follows: first, the number of points in each point's neighborhood is counted and stored in its pts attribute; each point whose pts attribute is greater than MinPts is marked as a core point and stored in the core point array. Second, the core point array is traversed depth-first, and the core points within the neighborhood of the current core point are chained into one class. Finally, all boundary points are traversed, and each boundary point in a core point's neighborhood is merged into that core point's class. The clustering result is shown in fig. 3: there are three obstacles in the field of view, two pedestrians and one vehicle-like obstacle.
The result shows that after clustering with the DBSCAN algorithm, point clouds belonging to different dynamic obstacles are divided into different clusters. The algorithm also filters static obstacle points that were not completely removed earlier: such points are few and not densely distributed, so they cannot be classified, and after clustering they are removed by deleting the unclassified points.
The third step: convex hull extraction
After the point cloud is clustered, each class is treated as a detected moving object, and moving-object data from different frames must then be associated to determine each object's track, which requires feature information for each class. The convex hull of a point set Q is the minimum convex polygon such that every point of Q lies on an edge of the polygon or inside it; the polygon drawn with line segments in FIG. 5 is the convex hull of the point set Q = {p0, p1, ..., p12}. The outline shape of a clustered point cloud is its main classification feature, and the convex hull of the clustered point cloud is obtained with the Graham scan algorithm. The magnitude |a × b| of the cross product can be interpreted as the area of the parallelogram formed when the two vectors a and b share a starting point.
The cross product formula of the two-dimensional vector is as follows:
p1 × p2 = x1*y2 - x2*y1
if p1Xp2 is less than zero, p1 to p2 are counterclockwise, and if p1Xp2 is greater than zero, p1 to p2 are clockwise.
The convex hull extracted by the Graham algorithm envelops all points of the same cluster in a polygon. The convex hull reflects the outer contour of the dynamic obstacle, from which its geometric features are then extracted.
The fourth step: feature extraction
The most critical step in tracking dynamic obstacles is associating the positions of each obstacle at different times into one group. The present embodiment correlates the data based on observation of the two-dimensional point cloud, the main basis being the geometry of the dynamic obstacle. The data processing has two steps: first, the obstacles are fuzzily classified into ellipses, rectangles and lines according to the extracted convex-hull angle information; then the different point-cloud types are fitted and the position points and geometric features of interest are extracted.
The main moving objects in a mine environment are pedestrians and mine cars: a pedestrian resembles an ellipse, while a mine car, observed from different angles, yields either a rectangular or a linear point cloud. An arbitrary point cloud can thus be assigned to one of these three geometries, and the method accordingly divides the clustered point clouds into three types: 'e' (ellipse), 'r' (rectangle) and 'l' (line).
The radar can only observe the point cloud of one side of a moving object, as shown in fig. 4; a single-sided point cloud reflects only part of the obstacle's geometry, and its most prominent geometric feature is the included-angle information. For pedestrians, the point cloud from every direction resembles a circular arc, while vehicles can be summarized as rectangles. The fuzzy classification of point clouds by included-angle information is shown in fig. 4: point clouds like 4(a) are classified as ellipses, those like 4(b) as lines, and those like 4(c) as rectangles.
The angle information of the point cloud is obtained by convex hull, as shown in fig. 5.
The Graham algorithm yields a set of convex hull points for the point cloud. Observation of radar data from actual vehicles and pedestrians shows that the point closest to the lidar is typically the vertex of the angle. Angle statistics with this closest point as vertex are therefore computed and used as the angle feature of the point cloud. The specific procedure is as follows:
1) Take the point in the convex hull array closest to the laser radar as the vertex.
2) Starting from the lower-left point of the convex hull array, traverse the hull points clockwise to obtain end points of one side of the angle and counterclockwise to obtain end points of the other side; calculate each angle, and record the maximum angle and the angle sum. While traversing the hull points, any angle whose end point lies within 0.05 m of the vertex is considered invalid. Summarizing a large number of angle computations gives the classification ranges for clustered point clouds shown in table 1:
TABLE 1 Angle Classification Standard Table
[Table 1 is reproduced as an image in the original publication.]
3) After the radar point-cloud angle features of pedestrians and vehicle-like obstacles are extracted, angle statistics yield the classification standard shown in table 1. The average angle is used as the primary criterion; if the average angle falls within the interval 0.7–3.0, the maximum angle is used instead. Experiments show the accuracy of this method is about 90%.
4) Once the type is determined, whether the point cloud originates from a vehicle or a pedestrian is known, and the point cloud is processed further according to the different motion characteristics of vehicles and pedestrians.
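The angle statistics of step 2) can be sketched as follows. The function, its name, and the `min_arm` parameter (the 0.05 m validity threshold) are illustrative assumptions; it takes the hull points, picks the point closest to the sensor as the vertex, and returns the maximum and mean angle formed by pairs of the remaining hull points:

```python
import math

def angle_feature(hull, sensor=(0.0, 0.0), min_arm=0.05):
    """Illustrative sketch: the hull point closest to the sensor is the
    vertex; angles whose arm is shorter than min_arm are ignored."""
    v = min(hull, key=lambda p: math.hypot(p[0] - sensor[0], p[1] - sensor[1]))
    others = [p for p in hull
              if p != v and math.hypot(p[0] - v[0], p[1] - v[1]) > min_arm]
    angles = []
    for i in range(len(others)):
        for j in range(i + 1, len(others)):
            a = (others[i][0] - v[0], others[i][1] - v[1])
            b = (others[j][0] - v[0], others[j][1] - v[1])
            cosang = (a[0] * b[0] + a[1] * b[1]) / (math.hypot(*a) * math.hypot(*b))
            angles.append(math.acos(max(-1.0, min(1.0, cosang))))
    # Maximum and mean angle in radians, used as the point cloud's angle feature.
    return max(angles), sum(angles) / len(angles)
```

The maximum and mean values returned here correspond to the maximum-angle and average-angle criteria used with table 1.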
Different fitting formulas are used for the different geometric types so as to compute their geometric features. An ellipse is fitted with formula (3) to obtain its center, major axis, minor axis and deflection angle. A line is fitted with formula (4) to obtain its length, start-point, end-point and center-point coordinates. The rectangle-fitting function provided by OpenCV yields the rectangle's center point, length, width and four corner points. A point representing the position is then chosen differently for pedestrians and vehicles.
f = Σ((xi − xc)² + (yi − yc)² − R²)²    (3)
R² = Σ(yi − (a·xi + b))²    (4)
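The circle-style fit of formula (3) and the line fit of formula (4) can be sketched with least squares. The circle fit below uses the algebraic (Kåsa) linearization of formula (3), which is an assumption about the solution method (the patent only states the objective function); the line fit is ordinary polynomial least squares:

```python
import numpy as np

def fit_circle(xs, ys):
    """Algebraic (Kasa) least-squares circle fit, a linearised version of
    formula (3): minimise sum(((x-xc)^2 + (y-yc)^2 - R^2)^2)."""
    A = np.column_stack([xs, ys, np.ones(len(xs))])
    b = xs ** 2 + ys ** 2                     # x^2+y^2 = 2xc*x + 2yc*y + (R^2-xc^2-yc^2)
    (c1, c2, c3), *_ = np.linalg.lstsq(A, b, rcond=None)
    xc, yc = c1 / 2, c2 / 2
    r = np.sqrt(c3 + xc ** 2 + yc ** 2)
    return xc, yc, r

def fit_line(xs, ys):
    """Least-squares line y = a*x + b, as in formula (4)."""
    a, b = np.polyfit(xs, ys, 1)
    return a, b
```

From the fitted line, the start, end and center points follow directly from the extreme x values of the cluster.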
Vehicle processing: as shown in FIG. 6, as a mine car drives past, the corresponding lidar data passes through three states. In the first state, FIG. 6(a), the point cloud forms the rectangular boundary of the vehicle front and the side nearer the radar; in the second state, FIG. 6(b), it forms the straight boundary of the side nearer the radar; in the third state, FIG. 6(c), it forms the rectangular boundary of the vehicle rear and the side nearer the radar.
The present embodiment uses the right-angle point formed by the vehicle front end and the radar-side face to represent the vehicle position, because the points that best reflect the vehicle's contour during this process are the right-angle point a at the front end and the right-angle point b at the rear end. These two points mark the foremost and rearmost ends of the vehicle respectively, and together define its side. Once the straight-line state has been observed, the vehicle length len can be obtained, and the position of point a can then be computed from the position of point b and len, so the front right-angle point is available in all three states. The estimation procedure for the front right-angle point is shown in fig. 7 and comprises the following steps:
1) Determine the driving direction: subtract the rectangle center of the vehicle's position at the previous moment from the rectangle center of its current position to obtain the driving-direction vector A. If the current state is the straight-line state, use the difference between the start and end points of the line as vector A;
2) Determine the driving state: subtract the recorded right-angle point from the center of the currently fitted rectangle to obtain vector B;
3) Compute the angle between vectors A and B. If it exceeds 90 degrees, the rectangle center lies ahead of the right-angle point and the vehicle is in the third state; if it is less than 90 degrees, the rectangle center lies behind the right-angle point and the vehicle is in the first state;
4) in the third state, the right-angle point a is calculated by using the formulas (5) and (6).
Xa = Xb + len·cos(arctan(angle))    (5)
Ya = Yb + len·sin(arctan(angle))    (6)
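Formulas (5) and (6) recover the front corner a from the rear corner b, the accumulated vehicle length and the slope of the driving direction. A small sketch (the function name and the interpretation of "angle" as the slope k of the driving direction are assumptions drawn from the surrounding text):

```python
import math

def front_corner(b, length, slope):
    """Sketch of formulas (5)-(6): front right-angle point a from the rear
    point b, the vehicle length len, and the driving-direction slope k."""
    theta = math.atan(slope)                 # heading angle from the slope
    return (b[0] + length * math.cos(theta),
            b[1] + length * math.sin(theta))
```

With a zero slope the front corner lies exactly len ahead of b along the x axis, matching the intuition of the third state.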
Pedestrian processing: the center of the ellipse represents the pedestrian's position. Since a pedestrian's geometry resembles an ellipse, a single pedestrian's footprint is within 1 square meter, the fitted ellipse is small, and the distance from its center to its contour is short, using the ellipse center as the position of the pedestrian obstacle is feasible.
Feature extraction thus yields dynamic-obstacle type and position information within an allowable error range, and data association is then performed on this reliable information.
The fifth step: data association
After the above operations, the raw point cloud data has been converted into the characteristic shapes listed in table 2. Data association links the tracked trajectories with the clustered point clouds detected in the current data frame so as to update the motion trajectories of the dynamic obstacles. The invention combines the nearest-neighbor (NN) and multiple-hypothesis tracking (MHT) algorithms into a multi-target association method based on a weighted distance.
TABLE 2 Point cloud characteristic information Table
[Table 2 is reproduced as an image in the original publication.]
Common data association algorithms include the nearest-neighbor algorithm and the multiple-hypothesis tracking algorithm. The multi-target association method that combines the characteristics of both and uses the weighted distance as its association criterion is here called the weighted-distance algorithm. It assigns different weights to the quantitative features of the point cloud according to each feature's importance for data association, so that the features of the obstacle point cloud are considered comprehensively while the probability computation time is reduced.
Candidate association features include: the distance between obstacle tracking points; the width and length of the obstacle; and the included angle between the obstacle's rectangular boundary and the x coordinate axis. For the same obstacle, the fitted point-cloud dimensions may change with distance and viewing direction, so the size features add error; likewise, the angle information of a sparse two-dimensional radar point cloud is not stable enough. The final point-cloud weight therefore uses the fuzzy classification type of the point cloud and the corrected position as the association criteria.
The weight formula is as follows: weight = coef[0]·W[0] + coef[1]·W[1]    (7)
The coef[] array in weight formula (7) holds the coefficient of each weight. W[0] is the type weight: 1 if the types are the same, −1 otherwise. W[1] is the distance weight, the distance between the track's predicted point and the current-frame point cloud; since a smaller distance means a stronger association and should carry a higher weight, the distance weight is negatively related to the final weight, and this embodiment of the invention uses (coefficient − distance) as the weighted distance.
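Formula (7) can be sketched as a small scoring function. The dictionary field names (`cls`, `pred`, `pos`) are illustrative assumptions; the type weight is ±1 as stated, and the distance weight is negated so that closer detections score higher:

```python
def association_weight(track, cloud, coef=(1.0, 1.0)):
    """Sketch of formula (7): weight = coef[0]*W[0] + coef[1]*W[1].
    W[0] is the type weight (+1 same fuzzy class, -1 otherwise); W[1] is
    the (negated) distance between the track's predicted point and the
    current-frame point cloud. Field names are illustrative."""
    w_type = 1.0 if track["cls"] == cloud["cls"] else -1.0
    dx = track["pred"][0] - cloud["pos"][0]
    dy = track["pred"][1] - cloud["pos"][1]
    w_dist = -((dx * dx + dy * dy) ** 0.5)   # closer detection -> larger weight
    return coef[0] * w_type + coef[1] * w_dist
```

A detection of the same type close to the predicted point thus gets the highest association weight.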
Referring to the association strategy of the multi-target association algorithm, the track update matching process is discussed in cases as shown in fig. 8.
The matching process mainly considers the following three cases:
In the first case, a newly appearing obstacle: if an obstacle's association weights with all trajectories are below the successful-association limit wei_limit, it is considered a new obstacle and a new trajectory is established. Since a trajectory cannot be predicted from a single position point, an initial radius pre_radius is defined to draw a circle, as shown in fig. 8, and the distance between the point-cloud position and this circular trajectory is used as the criterion at the next time step.
In the second case, the obstacle matches a trajectory: if the obstacle's association value with an existing trajectory exceeds wei_limit and is the maximum among the candidates, the trajectory is updated with the information provided by the obstacle. Because fuzzy classification errors occur, a trajectory may contain point clouds of different types; during the update, the number of each type in the trajectory is counted and the trajectory type is set to the most frequent one. The trajectory's type counts, size and position are updated, combining the current obstacle information and the predicted track point by weighted addition in the spirit of Kalman filtering.
In the third case, the obstacle disappears: trajectories that match no obstacle are deleted.
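The three cases can be sketched as one matching pass over tracks and detections. This is a simplified greedy sketch (the embodiment takes the maximum association value per track); the `weight` argument is assumed to be an association function in the spirit of formula (7), and the dictionary fields are illustrative:

```python
def update_tracks(tracks, detections, weight, wei_limit=0.0):
    """Sketch of the three matching cases: best-weight update (case 2),
    new-track creation (case 1), stale-track deletion (case 3)."""
    matched_tracks, matched_dets = set(), set()
    # Case 2: update each track with its best detection above wei_limit.
    for ti, tr in enumerate(tracks):
        best, best_w = None, wei_limit
        for di, det in enumerate(detections):
            if di in matched_dets:
                continue
            w = weight(tr, det)
            if w > best_w:
                best, best_w = di, w
        if best is not None:
            tr["pos"] = detections[best]["pos"]   # update the track
            matched_tracks.add(ti)
            matched_dets.add(best)
    # Case 1: unmatched detections start new tracks.
    new = [{"cls": d["cls"], "pos": d["pos"]}
           for i, d in enumerate(detections) if i not in matched_dets]
    # Case 3: tracks that matched no detection are deleted.
    kept = [tr for i, tr in enumerate(tracks) if i in matched_tracks]
    return kept + new
```

A full implementation would additionally keep the per-type counts and the pre_radius bootstrap circle described above.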
And a sixth step: trajectory prediction
The center coordinates of the clustered characteristic shapes are stored as track points. To describe and predict the motion trajectory more accurately, a least-squares polynomial curve is preferably fitted to each trajectory, with the formula:
A*X=Y (8)
R² = Σ(yi − (a0 + a1·xi + a2·xi² + a3·xi³))²    (9)
A set of track points is substituted to solve for the parameter vector A that minimizes formula (8). When the trajectory has a sufficient number of track points (at least 4), the polynomial-curve least-squares method of formula (9) is used; when fewer than 4 track points have been collected, the least-squares straight-line fit of formula (10) is used for the predicted trajectory.
R² = Σ(yi − (a1·xi + a0))²    (10)
Trajectory prediction is implemented as straight-line prediction and polynomial-curve prediction. The basic flow is: using the fitted function, increase the x value of the last track point by a suitable increment to obtain xp, then substitute xp into the fitted function to obtain yp; the coordinate (xp, yp) is the predicted track point.
xp=x[n]+(len)/(t*n) (11)
xp=x[n]+len/(t*n)*k (12)
Formula (11) computes xp for straight-line prediction: x[n] is the current track point, len is the sum of the absolute x-coordinate increments of the existing track points, t is an adjustable parameter, and n is the number of existing track points. The embodiment estimates the likely x increment at the current moment from the average x increment of the existing track. Formula (12) computes xp for polynomial-curve prediction: x[n], n and t have the same meanings as in straight-line prediction, len is the sum of the Euclidean distances between the existing track points, and k is the slope of the current trajectory. Curve prediction estimates the likely x increment at the current moment from the average Euclidean distance of the existing track, with the change directed along the tangent. The trajectory-prediction results are shown in fig. 9, where fig. 9(a) is a straight-line-fitted trajectory and fig. 9(b) is a polynomial-fitted trajectory.
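The prediction step can be sketched by combining the fit of formulas (9)/(10) with the x advance of formula (11). For brevity this sketch uses the absolute-x-increment form of formula (11) for both branches, which is a simplification of the text (the curve branch of formula (12) additionally scales by the slope k):

```python
import numpy as np

def predict_point(xs, ys, t=1.0):
    """Fit a line (fewer than 4 points) or a cubic (otherwise) by least
    squares, advance x by the average per-point x increment (formula (11)),
    and evaluate the fit at x_p to get the predicted track point."""
    n = len(xs)
    deg = 1 if n < 4 else 3                  # formula (10) vs formula (9)
    coeffs = np.polyfit(xs, ys, deg)
    len_sum = sum(abs(xs[i + 1] - xs[i]) for i in range(n - 1))  # sum of |dx|
    xp = xs[-1] + len_sum / (t * n)          # formula (11)
    return xp, float(np.polyval(coeffs, xp))
```

With three collinear points the prediction falls on the fitted line; with five or more points the cubic fit captures curved trajectories.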
To verify the feasibility of the algorithm, a grid map of the laboratory was built with a SLAM map-building algorithm; the algorithm then intercepts the lidar data that would otherwise go directly to the navigation module, processes it, and generates the obstacle-trajectory prediction point cloud, which is passed to the navigation module through the ROS framework. Navigation assisted by the generated trajectory point cloud shows that, compared with the case without this processing, the trajectory point cloud generated by the algorithm effectively assists the robot's obstacle-avoidance navigation.
In summary, in the dynamic obstacle tracking method based on sparse lidar data according to the embodiments, the grid map filters the static-obstacle lidar point cloud, and the few remaining static points, being very sparse, are removed by the DBSCAN algorithm, strengthening the final filtering effect; extracting the angle information of the point-cloud convex hull fuzzily classifies the point cloud into ellipse, rectangle and line, and the size of the fitted shape then helps locate the correct point-cloud position point, ensuring the accuracy of data association; and by integrating the characteristics of the nearest-neighbor and multiple-hypothesis algorithms, the improved multi-target association algorithm completes data association efficiently while maintaining accuracy, making the method worth popularizing.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (10)

1. A dynamic obstacle tracking method based on sparse lidar data is characterized by comprising the following steps:
s1: eliminating static background
Determining a travelable area, deleting the ground point cloud of the travelable area and the point cloud exceeding a specified height, and obtaining the residual point cloud as the obstacle point cloud;
s2: dynamic point cloud clustering
Clustering the obstacle point clouds in the step S1, and dividing the point clouds belonging to different dynamic obstacles into different clusters;
s3: convex hull extraction
After the clustering of step S2, extracting a convex hull of the clustered point cloud;
s4: feature extraction
Carrying out fuzzy classification by utilizing the angle information of the convex hull, and fitting the point clouds to obtain position information of the obstacle;
s5: data association
Associating positions belonging to the same barrier at different moments according to position information of the barriers in different frames, fitting the positions into a track, associating the tracked track with the clustered point cloud detected by the current data frame, and updating the motion track of the dynamic barrier;
s6: predicting trajectories
Predicting the position of the dynamic obstacle using a trajectory function: fitting the track points by the least-squares method to obtain the trajectory function; according to the trajectory function, adding a suitable increment to the abscissa of the last track point to obtain xp, and substituting xp into the function to obtain yp; the coordinate (xp, yp) is the predicted track point. The least-squares formula introduced in straight-line prediction is as follows:
R² = Σ(yi − (a1·xi + a0))²
wherein (xi, yi) is a track point of the fitted straight line, a1 is the slope of the fitted straight line, a0 is the intercept of the fitted straight line, and R² is the sum of the distance differences between the track points and the fitted straight line; polynomial-curve prediction introduces the following formula:
R² = Σ(yi − (a0 + a1·xi + a2·xi² + a3·xi³))²
wherein (xi, yi) is a track point of the fitted cubic polynomial, a0, a1, a2, a3 are the coefficients of the fitted cubic polynomial, and R² is the sum of the distance differences between the track points and the fitted curve;
the trajectory functions include a straight line prediction function and a polynomial curve prediction function, and the straight line prediction function is as follows:
xp=x[n]+(len)/(t*n)
wherein x[n] is the current track point, len is the sum of the absolute x-coordinate increments of the existing track points, t is an adjustable parameter, n is the number of existing track points, and straight-line prediction estimates the x increment at the current moment from the average x increment between the existing track points; the polynomial curve prediction function is as follows:
xp=x[n]+len/(t*n)*k
wherein x[n] is the current track point, t is an adjustable parameter, n is the number of existing track points, len is the sum of the Euclidean distances between the existing track points, k is the slope of the current trajectory, and polynomial-curve prediction estimates the x increment at the current moment from the average x increment between the existing track points.
2. The dynamic obstacle tracking method based on sparse lidar data of claim 1, wherein: the point cloud contains only coordinate information in the form of two-dimensional coordinates (x, y).
3. The dynamic obstacle tracking method based on sparse lidar data of claim 1, wherein the specific processing procedure in step S1 comprises the following steps:
s11: determining an acquisition plane
Setting the height of the acquisition plane to be 0.8-1.0 m according to the height of the main obstacle;
s12: generating a template
Constructing a grid map or converting an existing mine map into the grid map, wherein the grid map is divided into an obstacle layer, an expansion layer and a travelable layer;
s13: pretreatment of
Loading the grid map built in the step S12, setting the thickness of the expansion layer to be 0.05 m, receiving the original data of the laser radar, processing the point cloud, and deleting the illegal value in the point cloud;
s14: filtering point cloud
Comparing the point coordinates with the grid map to judge whether the point lies on a travelable layer; if so, keeping the point, otherwise deleting it.
4. The dynamic obstacle tracking method based on sparse lidar data of claim 1, wherein in step S2, the point cloud is divided into a core point and a boundary point in the clustering process, wherein the core point is a clustered skeleton, and two core parameters are involved in the clustering process, respectively Eps, MinPts, Eps is a boundary formula of a neighborhood between the point clouds, as follows:
Ne={xi∈D|distance(xi,xj)<Eps}
wherein xi represents a point in the point cloud, distance(xi, xj) is the Euclidean distance between two points of the point cloud, Eps is the distance threshold below which the clustering algorithm assigns points to the same class, and D is the point set;
when the distance between the two points is less than Eps, the two points are considered to belong to the same point cloud; MinPts is a parameter for judging the core point, and if the number of points with the distance in the neighborhood of the points smaller than Eps is larger than MinPts, the points are judged to be the core points.
5. The sparse lidar data-based dynamic obstacle tracking method according to claim 4, wherein the clustering process comprises the following steps:
s21: counting the number of points in each point neighborhood, storing the number into pts attributes, judging whether each point pts attribute is larger than MinPts, if so, judging that the point is a core point, and storing the core point into a core point array;
s22: traversing the core point array to serially connect the core points in the neighborhood range of the current core point into one class;
s23: and traversing all the boundary points, and merging the boundary points in the neighborhood of the core point into the core point class.
6. The dynamic obstacle tracking method based on sparse lidar data of claim 1, wherein: in step S4, the moving objects in the mine environment are pedestrians and mine cars, pedestrians are classified as elliptical point clouds, mine cars viewed from different angles are classified as rectangular or linear point clouds, and the moving objects are classified into ellipse, rectangle and line types according to their characteristics by using the angle information of the convex hulls.
7. The method of claim 6, wherein: in step S4, the included-angle information of the point cloud is obtained from the convex hull; after a group of convex hull points of the point cloud is obtained, the point closest to the laser radar is selected as the vertex of the angle, and the angle statistics with this closest point as vertex are used as the angle feature of the point cloud, i.e., the included-angle information.
8. The dynamic obstacle tracking method based on sparse lidar data of claim 7, wherein the specific processing procedure of the point cloud angular features comprises the following steps:
s41: taking the point closest to the laser radar in the obtained convex hull array as a vertex;
s42: taking a point at the lower left corner in the convex hull array as a starting point, starting clockwise as an end point of one side of the angle, starting anticlockwise, taking a convex hull point as an end point of the other side of the angle, calculating the angle, and counting the maximum value of the angle and summing;
s43: obtaining an angle classification standard through angle statistics after point cloud angle features of the dynamic barrier are extracted, and determining an angle type according to the angle classification standard;
s44: and judging the source of the point cloud data according to the angle type, fitting different types of graphic point clouds by using different fitting formulas according to different driving characteristics of different sources, and further calculating the set characteristics of the graphic point clouds.
9. The method for tracking dynamic obstacle based on sparse lidar data of claim 8, wherein in step S44, the ellipse is fitted with an ellipse fitting formula to obtain a center, a major axis, a minor axis, and a deflection angle of the ellipse, wherein the ellipse fitting formula is as follows:
f = Σ(xi² + A·xi·yi + B·yi² + C·xi + D·yi + E)
wherein (xi, yi) are the points of the fitted ellipse, and A, B, C, D, E are the coefficients of the non-standard equation of the fitted ellipse;
and fitting a straight line by using a straight line fitting formula to obtain the length, the coordinates of the starting point, the coordinates of the ending point and the coordinates of the central point, wherein the straight line fitting formula is as follows:
R² = Σ(yi − (a·xi + b))²
wherein (xi, yi) is a track point of the fitted straight line, a is the slope of the fitted straight line, b is the intercept of the fitted straight line, and R² is the sum of the distance differences between the track points and the fitted straight line;
obtaining the center point, length, width and four corner-point coordinates of the rectangle by using an existing rectangle-fitting function; and then determining the point representing the position according to the difference between pedestrians and mine cars: for a vehicle, the right-angle point formed by the vehicle front end and the radar-side face represents the vehicle position, while a pedestrian is represented by an ellipse and the center point of the ellipse is used as the pedestrian's position point.
10. A dynamic obstacle tracking system based on sparse lidar data, wherein the tracking of a dynamic obstacle by the tracking method according to any one of claims 1 to 9 comprises:
the elimination module is used for determining a travelable area and deleting ground point clouds and point clouds exceeding a certain height in the travelable area;
the clustering module is used for clustering the obstacle point clouds and dividing the point clouds belonging to different dynamic obstacles into different clusters;
the convex hull extracting module is used for extracting a convex hull of the clustered point cloud after clustering;
the characteristic extraction module is used for carrying out fuzzy classification by utilizing the angle information of the convex hull and then fitting the point clouds to obtain the position information of the obstacle;
the data association module is used for associating positions at different moments belonging to the same barrier according to the position information of the barrier in different frames, fitting the positions into a track, associating the tracked track with the clustered point cloud detected by the current data frame, and updating the motion track of the dynamic barrier;
the track prediction module is used for predicting the position of the dynamic obstacle by utilizing a track function;
the control module is used for controlling the other modules to execute instructions;
the elimination module, the clustering module, the convex hull extraction module, the feature extraction module, the data association module and the track prediction module are all electrically connected with the control module.
CN202010192403.6A 2020-03-18 2020-03-18 Dynamic obstacle tracking method based on sparse laser radar data Active CN111337941B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010192403.6A CN111337941B (en) 2020-03-18 2020-03-18 Dynamic obstacle tracking method based on sparse laser radar data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010192403.6A CN111337941B (en) 2020-03-18 2020-03-18 Dynamic obstacle tracking method based on sparse laser radar data

Publications (2)

Publication Number Publication Date
CN111337941A true CN111337941A (en) 2020-06-26
CN111337941B CN111337941B (en) 2022-03-04

Family

ID=71184332

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010192403.6A Active CN111337941B (en) 2020-03-18 2020-03-18 Dynamic obstacle tracking method based on sparse laser radar data

Country Status (1)

Country Link
CN (1) CN111337941B (en)

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101676744A (en) * 2007-10-31 2010-03-24 北京航空航天大学 Method for tracking small target with high precision under complex background and low signal-to-noise ratio
CN104417562A (en) * 2013-08-29 2015-03-18 株式会社电装 Method and apparatus for recognizing road shape, program, and recording medium
KR20170074542A (en) * 2015-12-22 2017-06-30 부산대학교 산학협력단 Apparatus and Method for Building Indoor Map using Geometric Features and Probability Method
JP2018063236A (en) * 2016-10-13 2018-04-19 バイドゥ ネットコム サイエンス アンド テクノロジー(ペキン) カンパニー リミテッド Method and apparatus for annotating point cloud data
CN108983248A (en) * 2018-06-26 2018-12-11 长安大学 Connected-vehicle localization method based on 3D lidar and V2X
US20190086543A1 (en) * 2017-09-15 2019-03-21 Baidu Online Network Technology (Beijing) Co., Ltd. Method And Apparatus For Tracking Obstacle
CN109682388A (en) * 2018-12-21 2019-04-26 北京智行者科技有限公司 Method for determining a following path
US20190155290A1 (en) * 2017-07-13 2019-05-23 Beijing Didi Infinity Technology And Development Co., Ltd. Systems and methods for trajectory determination
WO2019136479A1 (en) * 2018-01-08 2019-07-11 The Regents Of The University Of California Surround vehicle tracking and motion prediction
CN110018489A (en) * 2019-04-25 2019-07-16 上海蔚来汽车有限公司 Target tracking method, device and controller and storage medium based on laser radar
CN110210389A (en) * 2019-05-31 2019-09-06 东南大学 Multi-target recognition and tracking method for road traffic scenes
CN110487183A (en) * 2019-08-27 2019-11-22 中国科学技术大学 Multi-target fiber position accurate detection system and usage method
US20190367020A1 (en) * 2018-05-31 2019-12-05 TuSimple System and method for proximate vehicle intention prediction for autonomous vehicles
CN110567324A (en) * 2019-09-04 2019-12-13 深圳市唯特视科技有限公司 Multi-target group threat degree prediction device and method based on DS evidence theory
CN110658531A (en) * 2019-08-23 2020-01-07 畅加风行(苏州)智能科技有限公司 Dynamic target tracking method for autonomous driving vehicles in ports

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
HONGZHUAN HU et al.: "A micro-optical fiber positioner", Advances in Optical and Mechanical Technologies for Telescopes and Instrumentation III *
MUHAMMAD SUALEH et al.: "Dynamic Multi-LiDAR Based Multiple Object Detection and Tracking", Sensors 2019 *
YAN PENG et al.: "The obstacle detection and obstacle avoidance algorithm based on 2-D lidar", 2015 IEEE International Conference on Information and Automation *
周增祥 et al.: "Analysis of spherical measurement error of the LAMOST focal plane", Journal of University of Science and Technology of China *
孙双 et al.: "Maneuvering target trajectory acquisition based on polynomial predictive filtering", Journal of Nanjing Normal University (Engineering and Technology Edition) *
寇添 et al.: "Moving target trajectory detection model based on airborne lidar imaging", Laser & Optoelectronics Progress *
谢辉 et al.: "Trajectory prediction of dynamic vehicles on structured roads", Journal of Automotive Safety and Energy *
邹斌 et al.: "Dynamic obstacle detection and tracking method based on 3D lidar", Automobile Technology *

Cited By (68)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111915657A (en) * 2020-07-08 2020-11-10 浙江大华技术股份有限公司 Point cloud registration method and device, electronic equipment and storage medium
CN111813120A (en) * 2020-07-10 2020-10-23 北京林业大学 Method and device for identifying moving target of robot and electronic equipment
CN111708021A (en) * 2020-07-15 2020-09-25 四川长虹电器股份有限公司 Personnel tracking and identifying algorithm based on millimeter wave radar
CN111860321A (en) * 2020-07-20 2020-10-30 浙江光珀智能科技有限公司 Obstacle identification method and system
CN111860321B (en) * 2020-07-20 2023-12-22 浙江光珀智能科技有限公司 Obstacle recognition method and system
CN111929699A (en) * 2020-07-21 2020-11-13 北京建筑大学 Laser radar inertial navigation odometer considering dynamic obstacles and mapping method and system
CN111929676B (en) * 2020-07-30 2022-07-08 上海交通大学 X-band radar target detection and tracking method based on density clustering
CN111929676A (en) * 2020-07-30 2020-11-13 上海交通大学 X-band radar target detection and tracking method based on density clustering
CN111866305A (en) * 2020-08-11 2020-10-30 普达迪泰(天津)智能装备科技有限公司 Image enhancement and environment adaptability method under indoor and outdoor specific conditions
CN111967374A (en) * 2020-08-14 2020-11-20 安徽海博智能科技有限责任公司 Mine obstacle identification method, system and equipment based on image processing
CN112346014A (en) * 2020-09-23 2021-02-09 宁波大学 Multi-base sonar positioning method based on signal arrival time difference
CN112346014B (en) * 2020-09-23 2022-06-21 宁波大学 Multi-base sonar positioning method based on signal arrival time difference
CN112505678A (en) * 2020-10-23 2021-03-16 中国第一汽车股份有限公司 Vehicle track calculation method and device, vehicle and medium
CN112562419B (en) * 2020-11-03 2022-04-08 南京航空航天大学 Off-line multi-target tracking-based weather avoidance zone setting method
CN112562419A (en) * 2020-11-03 2021-03-26 南京航空航天大学 Off-line multi-target tracking-based weather avoidance zone setting method
CN112560580A (en) * 2020-11-20 2021-03-26 腾讯科技(深圳)有限公司 Obstacle recognition method, device, system, storage medium and electronic equipment
TWI785421B (en) * 2020-11-24 2022-12-01 大云永續科技股份有限公司 Inventory management method and system
CN112379673A (en) * 2020-11-26 2021-02-19 广东盈峰智能环卫科技有限公司 Robot self-following method and device based on single-line laser radar and robot
CN112379673B (en) * 2020-11-26 2023-12-29 广东盈峰智能环卫科技有限公司 Robot self-following method and device based on single-line laser radar and robot
CN112666569A (en) * 2020-12-01 2021-04-16 天津优控智行科技有限公司 Compression method of laser radar continuous point cloud of unmanned system
CN112529874A (en) * 2020-12-14 2021-03-19 上海智蕙林医疗科技有限公司 Obstacle detection method, device, medium and robot based on three-dimensional radar
US11885638B2 (en) 2020-12-28 2024-01-30 Bear Robotics, Inc. Method, system, and non-transitory computer-readable recording medium for generating a map for a robot
EP4024338A1 (en) * 2020-12-28 2022-07-06 Bear Robotics, Inc. Method, system, and non-transitory computer-readable recording medium for generating a map for a robot
CN112733922A (en) * 2021-01-04 2021-04-30 上海高仙自动化科技发展有限公司 Method and device for determining forbidden area, robot and storage medium
CN112883909A (en) * 2021-03-16 2021-06-01 东软睿驰汽车技术(沈阳)有限公司 Surrounding box-based obstacle position detection method and device and electronic equipment
CN113074748B (en) * 2021-03-29 2022-08-26 北京三快在线科技有限公司 Path planning method and device for unmanned equipment
CN113074748A (en) * 2021-03-29 2021-07-06 北京三快在线科技有限公司 Path planning method and device for unmanned equipment
CN113111887A (en) * 2021-04-26 2021-07-13 河海大学常州校区 Semantic segmentation method and system based on information fusion of camera and laser radar
CN113111887B (en) * 2021-04-26 2022-04-15 河海大学常州校区 Semantic segmentation method and system based on information fusion of camera and laser radar
WO2022243991A1 (en) * 2021-05-03 2022-11-24 Israel Aerospace Industries Ltd. System and method for tracking and classifying moving objects
CN113111973A (en) * 2021-05-10 2021-07-13 北京华捷艾米科技有限公司 Depth camera-based dynamic scene processing method and device
CN113191459A (en) * 2021-05-27 2021-07-30 山东高速建设管理集团有限公司 Road-side laser radar-based in-transit target classification method
CN113269889A (en) * 2021-06-04 2021-08-17 重庆大学 Self-adaptive point cloud target clustering method based on elliptical domain
CN113269889B (en) * 2021-06-04 2023-04-07 重庆大学 Self-adaptive point cloud target clustering method based on elliptical domain
CN113183967A (en) * 2021-06-04 2021-07-30 多伦科技股份有限公司 Vehicle safety control method, device, equipment and storage medium
CN113313654B (en) * 2021-06-23 2023-12-15 上海西井科技股份有限公司 Laser point cloud filtering denoising method, system, equipment and storage medium
CN113536959A (en) * 2021-06-23 2021-10-22 复旦大学 Dynamic obstacle detection method based on stereoscopic vision
CN113313654A (en) * 2021-06-23 2021-08-27 上海西井信息科技有限公司 Laser point cloud filtering and denoising method, system, equipment and storage medium
CN113447953A (en) * 2021-06-29 2021-09-28 山东高速建设管理集团有限公司 Background filtering method based on road traffic point cloud data
WO2023284705A1 (en) * 2021-07-13 2023-01-19 华为技术有限公司 Laser radar point cloud clustering method and apparatus, laser radar, and vehicle
CN113759900B (en) * 2021-08-12 2023-05-02 中南大学 Method and system for track planning and real-time obstacle avoidance of inspection robot based on obstacle region prediction
CN113759900A (en) * 2021-08-12 2021-12-07 中南大学 Inspection robot track planning and real-time obstacle avoidance method and system based on obstacle area prediction
CN113807239B (en) * 2021-09-15 2023-12-08 京东鲲鹏(江苏)科技有限公司 Point cloud data processing method and device, storage medium and electronic equipment
CN113807239A (en) * 2021-09-15 2021-12-17 京东鲲鹏(江苏)科技有限公司 Point cloud data processing method and device, storage medium and electronic equipment
CN116069051A (en) * 2021-10-29 2023-05-05 北京三快在线科技有限公司 Unmanned aerial vehicle control method, device, equipment and readable storage medium
CN116069051B (en) * 2021-10-29 2024-03-19 北京三快在线科技有限公司 Unmanned aerial vehicle control method, device, equipment and readable storage medium
CN114266796A (en) * 2021-11-05 2022-04-01 广东省国土资源测绘院 Method, device, medium and product for automatically acquiring natural shoreline based on laser point cloud data and average large-tide high-tide surface
CN114037736B (en) * 2021-11-15 2024-05-14 重庆大学 Vehicle detection and tracking method based on self-adaptive model
CN114037736A (en) * 2021-11-15 2022-02-11 重庆大学 Vehicle detection and tracking method based on self-adaptive model
CN113926174A (en) * 2021-11-16 2022-01-14 南京禾蕴信息科技有限公司 Pile-winding motion trajectory analysis and timing device and analysis method thereof
CN114460582B (en) * 2021-12-14 2023-04-14 江苏航天大为科技股份有限公司 Millimeter wave radar cart identification method based on point cloud speed
CN114460582A (en) * 2021-12-14 2022-05-10 江苏航天大为科技股份有限公司 Millimeter wave radar cart identification method based on point cloud speed
US11720110B2 (en) 2022-03-23 2023-08-08 Anhui University Dynamic obstacle avoidance method based on real-time local grid map construction
CN114384920A (en) * 2022-03-23 2022-04-22 安徽大学 Dynamic obstacle avoidance method based on real-time construction of local grid map
WO2023179718A1 (en) * 2022-03-24 2023-09-28 北京一径科技有限公司 Point cloud processing method and apparatus for lidar, and device and storage medium
CN114384492B (en) * 2022-03-24 2022-06-24 北京一径科技有限公司 Point cloud processing method and device for laser radar and storage medium
CN114384491B (en) * 2022-03-24 2022-07-12 北京一径科技有限公司 Point cloud processing method and device for laser radar and storage medium
CN114384492A (en) * 2022-03-24 2022-04-22 北京一径科技有限公司 Point cloud processing method and device for laser radar and storage medium
CN114384491A (en) * 2022-03-24 2022-04-22 北京一径科技有限公司 Point cloud processing method and device for laser radar and storage medium
WO2023179717A1 (en) * 2022-03-24 2023-09-28 北京一径科技有限公司 Point cloud processing method and apparatus for laser radar, device, and storage medium
CN114973564A (en) * 2022-04-28 2022-08-30 北京机械设备研究所 Remote personnel intrusion detection method and device under non-illumination condition
CN114743181A (en) * 2022-04-29 2022-07-12 重庆长安汽车股份有限公司 Road vehicle target detection method and system, electronic device and storage medium
CN115561772B (en) * 2022-08-26 2023-08-29 东莞理工学院 Laser radar driving environment cognition system based on visual area guidance
CN115561772A (en) * 2022-08-26 2023-01-03 东莞理工学院 Laser radar driving environment cognitive system based on visual area guidance
CN116309689A (en) * 2023-05-17 2023-06-23 上海木蚁机器人科技有限公司 Obstacle track prediction method, device, equipment and medium
CN116309689B (en) * 2023-05-17 2023-07-28 上海木蚁机器人科技有限公司 Obstacle track prediction method, device, equipment and medium
CN116299474B (en) * 2023-05-23 2023-09-12 禾多科技(北京)有限公司 Integrated radar device and vehicle obstacle avoidance method
CN116299474A (en) * 2023-05-23 2023-06-23 禾多科技(北京)有限公司 Integrated radar device and vehicle obstacle avoidance method

Also Published As

Publication number Publication date
CN111337941B (en) 2022-03-04

Similar Documents

Publication Publication Date Title
CN111337941B (en) Dynamic obstacle tracking method based on sparse laser radar data
CN112014857B (en) Three-dimensional laser radar positioning and navigation method for intelligent inspection and inspection robot
CN109684921B (en) Road boundary detection and tracking method based on three-dimensional laser radar
CN111583369B (en) Laser SLAM method based on plane, line and corner point feature extraction
CN110163904A (en) Object marking method, control method for movement, device, equipment and storage medium
WO2022188663A1 (en) Target detection method and apparatus
CN114842438A (en) Terrain detection method, system and readable storage medium for autonomous driving vehicle
CN112184736B (en) Multi-plane extraction method based on European clustering
CN112836633A (en) Parking space detection method and parking space detection system
CN110197173B (en) Road edge detection method based on binocular vision
Zhang et al. Multiple vehicle-like target tracking based on the velodyne lidar
Gosala et al. Redundant perception and state estimation for reliable autonomous racing
Meng et al. Loop-closure detection with a multiresolution point cloud histogram mode in lidar odometry and mapping for intelligent vehicles
CN115861968A (en) Dynamic obstacle removing method based on real-time point cloud data
Rasmussen et al. On-vehicle and aerial texture analysis for vision-based desert road following
CN115908539A (en) Target volume automatic measurement method and device and storage medium
CN114549549B (en) Dynamic target modeling tracking method based on instance segmentation in dynamic environment
Xiong et al. Road-Model-Based road boundary extraction for high definition map via LIDAR
CN114545435A (en) Dynamic target sensing system and method fusing camera and laser radar
Gate et al. Using targets appearance to improve pedestrian classification with a laser scanner
CN113554705B (en) Laser radar robust positioning method under changing scene
Wang et al. A robust submap-based road shape estimation via iterative Gaussian process regression
CN111239761B (en) Method for indoor real-time establishment of two-dimensional map
CN115373383A (en) Autonomous obstacle avoidance method and device for garbage recovery unmanned boat and related equipment
Mattson et al. Reducing ego vehicle energy-use by LiDAR-based lane-level positioning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant