CN107705563B - Laser radar-based continuous vehicle speed detection method - Google Patents


Info

Publication number
CN107705563B
CN107705563B (application CN201711219157.3A)
Authority
CN
China
Prior art keywords
vehicle
data
index
frame
vehicles
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711219157.3A
Other languages
Chinese (zh)
Other versions
CN107705563A (en)
Inventor
郑建颖
徐斌
王翔
徐浩
范学良
陶砚蕴
陈蓉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou University
Original Assignee
Suzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou University
Priority to CN201711219157.3A
Publication of CN107705563A
Application granted
Publication of CN107705563B


Classifications

    • G: PHYSICS
        • G08: SIGNALLING
            • G08G: TRAFFIC CONTROL SYSTEMS
                • G08G1/00: Traffic control systems for road vehicles
                    • G08G1/01: Detecting movement of traffic to be counted or controlled
                        • G08G1/0104: Measuring and analyzing of parameters relative to traffic conditions
                            • G08G1/0108: Measuring and analyzing of parameters based on the source of data
                                • G08G1/0116: based on data from roadside infrastructure, e.g. beacons
                            • G08G1/0125: Traffic data processing
                        • G08G1/04: Detecting movement of traffic using optical or ultrasonic detectors
                        • G08G1/052: Detecting movement of traffic with provision for determining speed or overspeed

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Traffic Control Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention relates to a continuous vehicle speed detection method based on laser radar, in the technical field of laser radar speed measurement and vehicle speed detection. It solves two problems of existing laser-radar-based vehicle speed detection: 2D laser radar detection cannot classify vehicle and non-vehicle objects, and a 64-line 3D laser radar is too expensive.

Description

Laser radar-based continuous vehicle speed detection method
Technical Field
The invention relates to the technical field of laser radar speed measurement and vehicle speed detection.
Background
According to 2010 statistics from the World Health Organization (WHO), about 1.25 million people worldwide die in road traffic accidents every year, and 20 to 50 million are injured. The road traffic safety problem is increasingly serious, and accurately extracting road information has a clear effect on improving road traffic safety.
Vehicles are an important component of road information, and accurately extracting vehicle characteristics is a precondition for solving the road traffic safety problem. In recent years, the development of laser radar technology has provided a new means for vehicle detection. From laser radar 3D point cloud data, a three-dimensional road traffic scene can be obtained, and traffic participants can be accurately detected and classified.
Current vehicle detection methods mainly include video detection, inductive-loop detection, microwave radar detection, infrared detection and magnetic sensor detection. Among these, video detection is the most widely commercialized: its detection range is large and it can detect various traffic events. However, video detection is very sensitive to light, since sudden changes in shadow and visible light increase its error, and it lacks distance information.
There is currently much work on road traffic analysis, in which the main problems are the detection and trajectory tracking of moving objects on the road, addressed with different sensing modes and detection methods. The main approaches in use are vision-based and distance-based road vehicle detection. Vision-based methods are susceptible to interference from visible light and shadow; together with the different shapes, sizes and colors of vehicles and the different moving directions of vehicles on multiple lanes, this makes vehicle attitude estimation very challenging. Distance-based road vehicle detection and trajectory tracking differ according to the chosen sensor, such as laser radar or microwave radar.
For dynamic laser radar detection scenes, the SLAMMOT algorithm has been used for vehicle tracking, with the earliest work dating back to 2007. For continuous traffic data acquisition, the ITS research group of the University of Minnesota developed a traffic information acquisition system based on a laser radar network for collecting road information at traffic intersections: the laser radar is used for vehicle detection and trajectory tracking, with a 2D laser radar performing vehicle detection and tracking in the horizontal plane.
These research works are based on the 64-line Velodyne HDL-64E laser radar, perform vehicle detection in dynamic environments, and study only vehicle behavior in a small range around the autonomous vehicle.
Disclosure of Invention
The invention solves two problems of the existing technology for laser-radar-based vehicle speed detection: 2D laser radar detection cannot classify vehicle and non-vehicle objects, and a 64-line 3D laser radar is too expensive.
The invention relates to a laser-radar-based continuous vehicle speed detection method, which uses a 16-line laser radar to continuously collect road scene data of the traffic lanes and then obtains the average speed of the vehicles in the current frame from the collected K frames of data. The specific process is as follows:
1) Calculate the distance matrix between all vehicles of two adjacent frames of data:
$$S=\begin{bmatrix}S_{11}&\cdots&S_{1m_{k-1}}\\\vdots&\ddots&\vdots\\S_{m_k1}&\cdots&S_{m_km_{k-1}}\end{bmatrix}$$
where the k-th frame data has m_k vehicles, S_ij represents the distance between the class center of the i-th vehicle in the k-th frame data and the class center of the j-th vehicle in the (k-1)-th frame data, i ∈ (0, m_k], j ∈ (0, m_(k-1)],
$$S_{ij}=\sqrt{\left(x_O^{k,i}-x_O^{k-1,j}\right)^2+\left(y_O^{k,i}-y_O^{k-1,j}\right)^2}$$
where x_O^(k,i) is the abscissa of the class center of the i-th vehicle in the k-th frame data, x_O^(k-1,j) is the abscissa of the class center of the j-th vehicle in the (k-1)-th frame data, y_O^(k,i) is the ordinate of the class center of the i-th vehicle in the k-th frame data, and y_O^(k-1,j) is the ordinate of the class center of the j-th vehicle in the (k-1)-th frame data;
Taking k = 2, 3, …, K yields the distance matrices between all vehicles of every pair of adjacent frames;
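As a minimal sketch of step 1 (function and variable names are illustrative, not from the patent), the pairwise distance matrix between the class centers of two adjacent frames can be computed as:

```python
import numpy as np

def distance_matrix(centers_k, centers_km1):
    """Pairwise Euclidean distances between vehicle class centers of
    frame k (rows) and frame k-1 (columns)."""
    a = np.asarray(centers_k, dtype=float)    # shape (m_k, 2)
    b = np.asarray(centers_km1, dtype=float)  # shape (m_{k-1}, 2)
    diff = a[:, None, :] - b[None, :, :]      # shape (m_k, m_{k-1}, 2)
    return np.sqrt((diff ** 2).sum(axis=2))

# Example: 2 vehicles in frame k, 3 vehicles in frame k-1
S = distance_matrix([(0.0, 0.0), (3.0, 4.0)],
                    [(0.0, 0.0), (3.0, 4.0), (6.0, 8.0)])
print(S.shape)   # (2, 3)
print(S[1, 2])   # 5.0, distance between (3,4) and (6,8)
```

Running this over k = 2, …, K produces one such matrix per pair of adjacent frames.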
2) Obtain the vehicle association matrix from the distance matrices:
From the distance matrix of the vehicles of each pair of adjacent frames, obtain the vehicle association data of that pair; aggregating the vehicle association data of all pairs of adjacent frames gives the vehicle association matrix:
$$A=\begin{bmatrix}a_{11}&\cdots&a_{1J}\\\vdots&\ddots&\vdots\\a_{K1}&\cdots&a_{KJ}\end{bmatrix}$$
where a_kj is the association entry of the j-th vehicle in the k-th frame, and J denotes the number of vehicles in the data frame containing the most vehicles among the K frames of data, j ∈ (0, J]; when the number of vehicles m_k in the k-th frame data is less than J, the entries a_kj with j > m_k are all 0;
3) Traverse all vehicles in the k-th frame data to obtain the detected vehicle set P = {V_id=1, V_id=2, …, V_id=count}. The specific process is as follows:
Judge whether a_ki corresponding to the i-th vehicle is 0:
If a_ki = 0, judge whether the attribute id of the i-th vehicle is 0. If it is, assign the vehicle a new id = count + 1, update count = count + 1, create an object V_id=count and add it to the detected vehicle set P = {V_id=1, V_id=2, …, V_id=count}; otherwise, leave the attribute id of the i-th vehicle unchanged.
If a_ki ≠ 0, associate the i-th vehicle with the vehicles in the detected vehicle set. If the association succeeds, set the vehicle's id to the id of the successfully associated vehicle and add it to the successfully associated vehicle set P′; if the association fails, assign the i-th vehicle a new id = count + 1, update count = count + 1, create an object V_id=count and add it to the detected vehicle set P = {V_id=1, V_id=2, …, V_id=count}.
After traversing all vehicles in the k-th frame data, the detected vehicle set P = {V_id=1, V_id=2, …, V_id=count} and the successfully associated vehicle set P′ = {V′_id=1, V′_id=2, …, V′_id=count′}, count′ ≤ count, are obtained.
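The id bookkeeping of step 3 can be sketched as follows (a simplified illustration; the names and the association callback are hypothetical, not from the patent):

```python
def traverse_frame(a_k, vehicle_ids, detected, associated, count, try_associate):
    """Step-3 bookkeeping sketch.
    a_k[i]         : association-matrix entry for vehicle i of frame k
    vehicle_ids[i] : current id attribute of vehicle i (0 = unassigned)
    detected       : set P of detected vehicle ids
    associated     : set P' of successfully associated ids
    try_associate  : callback returning the matched id, or None on failure
    """
    for i, a_ki in enumerate(a_k):
        if a_ki == 0:
            if vehicle_ids[i] == 0:      # brand-new vehicle: assign a new id
                count += 1
                vehicle_ids[i] = count
                detected.add(count)
            # else: id already assigned, leave unchanged
        else:
            match = try_associate(i)
            if match is not None:        # associate with an existing vehicle
                vehicle_ids[i] = match
                associated.add(match)
            else:                        # association failed: assign a new id
                count += 1
                vehicle_ids[i] = count
                detected.add(count)
    return count

detected, associated = set(), set()
ids = [0, 0, 0]
count = traverse_frame([0, 5, 7], ids, detected, associated, 0,
                       lambda i: 5 if i == 1 else None)
print(ids)  # [1, 5, 2]
```

Here vehicle 0 is new (a_ki = 0, id unset), vehicle 1 associates with existing id 5, and vehicle 2 fails to associate and receives a fresh id.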
4) Calculate the average speed of each vehicle in the successfully associated vehicle set:
$$v_{index'}=\frac{\sqrt{\left(x_{ForE}^{k_{index',e}}-x_{ForE}^{k_{index',s}}\right)^2+\left(y_{ForE}^{k_{index',e}}-y_{ForE}^{k_{index',s}}\right)^2}}{t_{index',e}-t_{index',s}}$$
where k_index′,s is the sequence number of the data frame in which the vehicle with id index′ first appears, k_index′,e is the sequence number of the data frame in which it last appears, x_ForE^(k_index′,e) and x_ForE^(k_index′,s) are the x-axis coordinates of the combined feature point ForE of that vehicle in the k_index′,e-th and k_index′,s-th frame data, y_ForE^(k_index′,e) and y_ForE^(k_index′,s) are the corresponding y-axis coordinates, and t_index′,s and t_index′,e are the timestamps of the k_index′,s-th and k_index′,e-th data frames.
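The average speed is thus the straight-line displacement of the ForE feature point between a vehicle's first and last appearance, divided by the elapsed time. A minimal sketch (illustrative names):

```python
import math

def average_speed(x_s, y_s, t_s, x_e, y_e, t_e):
    """Average speed of one vehicle between its first (s) and last (e)
    appearance, from the ForE feature-point coordinates and timestamps."""
    return math.hypot(x_e - x_s, y_e - y_s) / (t_e - t_s)

# 30 m of displacement in 2 s
v = average_speed(0.0, 0.0, 0.0, 30.0, 0.0, 2.0)
print(v)  # 15.0
```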
The invention uses a 16-line laser radar to realize continuous vehicle speed detection, providing an effective means for vehicle trajectory tracking.
The 16-line laser radar adopted by the invention is relatively low in cost and provides rich 3D point cloud information.
When measuring vehicle speed using adjacent frame data, the measurement is affected when the numbers of vehicles in the two adjacent frames differ. For example, if there are m vehicles in the current frame and n vehicles in the previous frame, there are three cases according to the relationship between m and n:
Case 1: when the number m of vehicles in the current frame is less than the number n in the previous frame, vehicles have left the detection area or disappeared due to occlusion and the like. At most the number of vehicles in the current frame can be associated successfully; vehicles in the previous frame not associated with the current frame can only be associated through subsequent frames.
Case 2: when m equals n, the vehicles of the two frames are most likely to be associated with each other successfully.
Case 3: when m is greater than n, new vehicles have entered or previously disappeared vehicles have reappeared, and some vehicles in the current frame then fail to associate.
Based on the above analysis, because vehicles are occluded and the detection distance is long, a vehicle may be randomly lost in certain frames of data, so vehicle association using only adjacent frame data fails and speed detection cannot be achieved.
The invention adopts a vehicle association method based on multi-frame data, overcoming the defects of using only adjacent frame data: it avoids the influence of frame loss on the measurement and also overcomes the influence of vehicles occluding one another.
In traffic engineering, the continuous speed curve of a vehicle is of great significance. Starting from laser radar 3D point cloud data, the invention obtains the continuous speed variation law of vehicles through vehicle detection and association, which is of great significance for further analysis of vehicle behavior.
Drawings
FIG. 1 is the two-way four-lane scenario detected by the Velodyne VLP-16 laser radar in Embodiment 1.
FIG. 2 shows the box model of a vehicle in the embodiment.
FIG. 3 illustrates the model of a real vehicle obtained from the box model by radar detection in the embodiment.
FIG. 4 shows the different detected forms of a vehicle at different positions relative to the laser radar in the embodiment.
FIGS. 5 to 8 are the speed curves of single-vehicle samples in the four lanes described in Test Experiment 1.
FIG. 9 is the raw image and grid-filtered result of frame 237 for the first-lane vehicle shown in FIG. 5.
FIG. 10 is a schematic diagram of the position of the real vehicle tail point P found from the laser radar 0° position in Test Experiment 2.
FIG. 11 shows the vehicle speed correction result for the first lane in Test Experiment 2.
FIG. 12 shows the vehicle speed correction result for the second lane in Test Experiment 2.
FIG. 13 shows the results of Test Experiment 4, in which frames 1 to 2500 of data indicate the detection distances of vehicles first entering and last leaving; evaluation indexes 2 to 3 of the results are shown in FIG. 14.
FIG. 15 is a graph obtained in Test Experiment 4 for frames 2500 to 5456. In the figure, "Left first detection" means the radar first detects the vehicle on its left side, and "Right first detection" means the radar first detects the vehicle on its right side.
Detailed Description
Embodiment 1: the laser-radar-based continuous vehicle speed detection method of this embodiment uses a 16-line laser radar to continuously collect road scene data of the traffic lanes, and then obtains the average speed of the vehicles in the current frame from the collected K frames of data. The specific process is as follows:
1) Calculate the distance matrix between all vehicles of two adjacent frames of data:
where the k-th frame data has m_k vehicles, S_ij represents the distance between the class center of the i-th vehicle in the k-th frame data and the class center of the j-th vehicle in the (k-1)-th frame data, i ∈ (0, m_k], j ∈ (0, m_(k-1)],
$$S_{ij}=\sqrt{\left(x_O^{k,i}-x_O^{k-1,j}\right)^2+\left(y_O^{k,i}-y_O^{k-1,j}\right)^2}$$
where x_O^(k,i) is the abscissa of the class center of the i-th vehicle in the k-th frame data, x_O^(k-1,j) is the abscissa of the class center of the j-th vehicle in the (k-1)-th frame data, y_O^(k,i) is the ordinate of the class center of the i-th vehicle in the k-th frame data, and y_O^(k-1,j) is the ordinate of the class center of the j-th vehicle in the (k-1)-th frame data;
Taking k = 2, 3, …, K yields the distance matrices between all vehicles of every pair of adjacent frames;
2) Obtain the vehicle association matrix from the distance matrices:
From the distance matrix of the vehicles of each pair of adjacent frames, obtain the vehicle association data of that pair; aggregating the vehicle association data of all pairs of adjacent frames gives the vehicle association matrix:
$$A=\begin{bmatrix}a_{11}&\cdots&a_{1J}\\\vdots&\ddots&\vdots\\a_{K1}&\cdots&a_{KJ}\end{bmatrix}$$
where a_kj is the association entry of the j-th vehicle in the k-th frame, and J denotes the number of vehicles in the data frame containing the most vehicles among the K frames of data, j ∈ (0, J]; when the number of vehicles m_k in the k-th frame data is less than J, the entries a_kj with j > m_k are all 0;
3) Traverse all vehicles in the k-th frame data to obtain the detected vehicle set P = {V_id=1, V_id=2, …, V_id=count}. The specific process is as follows:
Judge whether a_ki corresponding to the i-th vehicle is 0:
If a_ki = 0, judge whether the attribute id of the i-th vehicle is 0. If it is, assign the vehicle a new id = count + 1, update count = count + 1, create an object V_id=count and add it to the detected vehicle set P = {V_id=1, V_id=2, …, V_id=count}; otherwise, leave the attribute id of the i-th vehicle unchanged.
If a_ki ≠ 0, associate the i-th vehicle with the vehicles in the detected vehicle set. If the association succeeds, set the vehicle's id to the id of the successfully associated vehicle and add it to the successfully associated vehicle set P′; if the association fails, assign the i-th vehicle a new id = count + 1, update count = count + 1, create an object V_id=count and add it to the detected vehicle set P = {V_id=1, V_id=2, …, V_id=count}.
After traversing all vehicles in the k-th frame data, the detected vehicle set P = {V_id=1, V_id=2, …, V_id=count} and the successfully associated vehicle set P′ = {V′_id=1, V′_id=2, …, V′_id=count′}, count′ ≤ count, are obtained.
4) Calculate the average speed of each vehicle in the successfully associated vehicle set:
$$v_{index'}=\frac{\sqrt{\left(x_{ForE}^{k_{index',e}}-x_{ForE}^{k_{index',s}}\right)^2+\left(y_{ForE}^{k_{index',e}}-y_{ForE}^{k_{index',s}}\right)^2}}{t_{index',e}-t_{index',s}}$$
where k_index′,s is the sequence number of the data frame in which the vehicle with id index′ first appears, k_index′,e is the sequence number of the data frame in which it last appears, x_ForE^(k_index′,e) and x_ForE^(k_index′,s) are the x-axis coordinates of the combined feature point ForE of that vehicle in the k_index′,e-th and k_index′,s-th frame data, y_ForE^(k_index′,e) and y_ForE^(k_index′,s) are the corresponding y-axis coordinates, and t_index′,s and t_index′,e are the timestamps of the k_index′,s-th and k_index′,e-th data frames.
In laser radar vehicle detection, the same vehicle is clustered with different class numbers in different data frames, so vehicles must be associated in order to obtain the speed of the same vehicle across different data frames.
In such a detection mode, the vehicle association needs to solve the following problems:
1) A vehicle may be completely occluded, so that it disappears entirely from its original lane, causing association between two adjacent frames to fail;
2) A vehicle may be partially occluded, so that the vehicle form detected in its original lane is irregular and its geometric features are inconsistent across frames;
3) When a small vehicle travels in a far lane, the laser radar detects few effective points, so the detected appearance differs greatly from the real vehicle's appearance.
In view of these practical problems, this embodiment adopts a multi-frame vehicle association method built on adjacent-frame association, which solves all three of the above problems.
The method for obtaining the vehicle association data of two adjacent frames of data in step 2) is as follows:
From the distance matrix obtained in step 1), select the minimum distance S_ia of each row in turn; this minimum distance is the distance between the a-th vehicle in the (k-1)-th frame data and the i-th vehicle in the k-th frame data. Then further judge the following two conditions:
condition 1: sia≤Tmax_xy_moveWherein T ismax_xy_moveIs the maximum distance vehicles can travel between adjacent frames, which is determined by the maximum speed defined for the road segment;
condition 2: y iso(i)-yo(a)≤Tmax_y_move,Tmax_y_moveRepresenting the maximum offset of vehicles on the vertical axis between adjacent frames, the offset representing the offset of the vehicle from the direction of travel and actually representing the lane information of the vehicle, and selecting Tmax_y_moveThe advantages of the parameters are as follows: the vehicles in other lanes can be prevented from being matched with the current lane, and the association failure of the lane-changing vehicles can also be prevented;
if both conditions are met, the association is considered successful.
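The two conditions can be checked as in the following sketch (function and parameter names are illustrative, not from the patent; the lane offset is taken as an absolute value here):

```python
def adjacent_frame_match(S_ia, y_i, y_a, T_max_xy_move, T_max_y_move):
    """Adjacent-frame association test: both conditions must hold."""
    cond1 = S_ia <= T_max_xy_move           # within the max travel distance
    cond2 = abs(y_i - y_a) <= T_max_y_move  # stayed near the same lane
    return cond1 and cond2

print(adjacent_frame_match(2.5, 3.1, 3.0, 4.0, 0.5))  # True
print(adjacent_frame_match(5.0, 3.1, 3.0, 4.0, 0.5))  # False, moved too far
```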
In step 3), the method for associating the i-th vehicle with the vehicles in the detected vehicle set is as follows:
Judgment condition 1: whether the associated vehicle is located in or near the lane of the vehicle to be associated, i.e., when the condition
$$\left|y_{O,ki}-y_{O,index',e}\right|\le T_{max\_y\_move}$$
is met, the associated vehicle is considered to be in or near the lane of the vehicle to be associated, where y_O,ki is the class-center ordinate of the vehicle to be associated (the i-th vehicle in the k-th frame data), y_O,index′,e is the class-center ordinate, in the data frame of its last appearance, of the associated vehicle (the vehicle with id index′ in the detected vehicle set), and T_max_y_move is the maximum vertical-axis offset of a vehicle between adjacent frames;
Judgment condition 2: whether the associated vehicle appeared in a data frame close before the frame of the vehicle to be associated, i.e., when the condition
$$k_{ki}-k_{index',e}\le T_{max\_frame\_lose}$$
is met, the associated vehicle is considered to have appeared recently enough, where k_ki is the sequence number of the data frame of the vehicle to be associated, k_index′,e is the sequence number of the data frame of the associated vehicle's last appearance, and T_max_frame_lose is the maximum allowed number of lost frames;
Judgment condition 3: whether the associated vehicle appeared before the position of the vehicle to be associated. First obtain the vehicle motion direction:
$$v\_direction=\begin{cases}+1,&y_{O,ki}\le T_{v\_change\_lane}\\-1,&y_{O,ki}>T_{v\_change\_lane}\end{cases}$$
where v_direction represents the vehicle motion direction (+1: the vehicle moves in the positive direction of the horizontal axis; −1: the negative direction), T_v_change_lane is the lane distance at which the motion direction changes, and y_O,ki is the class-center ordinate of the vehicle to be associated. The positions of the vehicle to be associated and the associated vehicle should then satisfy the condition:
$$v\_direction\cdot\left(x_{O,ki}-x_{O,index',e}\right)>0$$
where x_O,ki is the class-center abscissa of the vehicle to be associated and x_O,index′,e is the class-center abscissa of the associated vehicle in the data frame of its last appearance;
If all three conditions are met simultaneously, the vehicle association is preliminarily judged successful;
Then, for each pair of associated vehicle and vehicle to be associated passing the preliminary judgment, calculate the difference between their distances at the current frame, and select the associated vehicle and vehicle to be associated corresponding to the minimum difference.
The distance difference at the current frame is calculated as follows:
According to the average speed v_index′ of the associated vehicle, the distance it is estimated to have traveled by the current frame is:
S_index′ = v_index′ (t_ki − t_index′,e),
where k is the data frame number of the current frame, t_ki is the timestamp of the current frame, and t_index′,e is the timestamp of the data frame of the associated vehicle's last appearance;
The distance between the vehicle to be associated and the associated vehicle is:
$$S_{index',ki}=\sqrt{\left(x_{ForE,ki}-x_{ForE,index',e}\right)^2+\left(y_{ForE,ki}-y_{ForE,index',e}\right)^2}$$
where x_ForE,ki and y_ForE,ki are the x- and y-axis coordinates of the combined feature point ForE of the vehicle to be associated in the current frame data, and x_ForE,index′,e and y_ForE,index′,e are those of the associated vehicle in the data frame of its last appearance;
the distance difference |S_index′,ki − S_index′| between the associated vehicle and the vehicle to be associated at the current frame is thus obtained.
The associated vehicle corresponding to the minimum distance difference is then associated with the vehicle to be associated, as follows:
Take the minimum of the distance differences between the vehicle to be associated and all associated vehicles:
ΔS_min_index′,ki = min |S_index′,ki − S_index′|
where min_index′ is the id of the associated vehicle with the smallest distance difference. When this minimum satisfies the condition ΔS_min_index′,ki ≤ T_min_ΔS, the association is judged successful and the id of the vehicle to be associated is assigned min_index′; T_min_ΔS is the allowed error of the distance estimate.
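This selection step, comparing predicted travel distance against observed displacement, can be sketched as follows (data layout and names are illustrative, not from the patent):

```python
import math

def pick_association(candidates, x_fore, y_fore, t_ki, T_min_dS):
    """From preliminarily matched vehicles, pick the one whose predicted
    travel distance best matches the observed displacement.
    candidates: dict id -> (v, t_e, x_e, y_e), last-seen state per vehicle.
    Returns the winning id, or None if the best difference exceeds T_min_dS."""
    best_id, best_diff = None, float("inf")
    for vid, (v, t_e, x_e, y_e) in candidates.items():
        S_pred = v * (t_ki - t_e)                       # predicted distance
        S_obs = math.hypot(x_fore - x_e, y_fore - y_e)  # observed distance
        diff = abs(S_obs - S_pred)
        if diff < best_diff:
            best_id, best_diff = vid, diff
    return best_id if best_diff <= T_min_dS else None

# Two candidates with speeds 10 and 20 m/s, both last seen at the origin
cand = {3: (10.0, 0.0, 0.0, 0.0), 7: (20.0, 0.0, 0.0, 0.0)}
print(pick_association(cand, 2.1, 0.0, 0.1, 0.5))  # 7
```

The vehicle with id 7 wins because its predicted travel of 2.0 m is closest to the observed 2.1 m displacement.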
In this embodiment, when calculating vehicle distances, the vehicle data obtained by the laser radar are represented with a vehicle box model, as shown in FIG. 2. In the box model, A, B, C and D are the vertices of the vehicle and point O is the vehicle center point; these are characteristic points of the vehicle, and O is also called the class center of the vehicle. In the figure, l1, l2, l3, l4 are the distances between the corresponding pairs of points and v is the speed at which the vehicle travels; (x_center, y_center) are the coordinates of the vehicle center point O, (x_min, y_min) the coordinates of vertex A, and (x_max, y_min) the coordinates of vertex B. Points E and F are two feature points of the vehicle box: they are real points on the vehicle detected by the laser radar and carry timestamp information. E and F are obtained by selecting the N points nearest to A and B respectively and averaging their x- and y-axis coordinates; their timestamp is the average of the timestamps of those N nearest points. FIG. 3 is the vehicle box model of a real vehicle acquired by the laser radar.
The vehicle box model also has a combined feature point ForE, obtained by selecting between the E point and the F point of the vehicle according to the relative position of the vehicle and the laser radar. Because this relative position differs, the vehicle is detected in different forms. Referring to FIG. 4:
S1: the vehicle is on the left side of the laser radar, and only points A, B and C can be detected;
S2: the vehicle directly faces the laser radar, and only points A and B are detected;
S3: the vehicle is on the right side of the laser radar, and only points A, B and D are detected. Moreover, the detected AB segment cannot represent the true vehicle length while the vehicle moves.
In summary, to prevent inaccurate vehicle speed calculation caused by a large distance deviation when the selected feature point does not correspond to the same position on the vehicle, the feature point, i.e., the combined feature point ForE, should be selected dynamically according to the positions of the vehicle and the laser radar.
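The E/F feature points described above, averages over the N detected points nearest a box corner, can be sketched as follows (a simplified illustration; names are not from the patent):

```python
import numpy as np

def feature_point(points, corner, n=5):
    """E/F feature point: average x, y and timestamp of the n detected
    points nearest to a box corner (corner = A or B)."""
    pts = np.asarray(points, dtype=float)  # columns: x, y, t
    d = np.hypot(pts[:, 0] - corner[0], pts[:, 1] - corner[1])
    nearest = pts[np.argsort(d)[:n]]       # n nearest points to the corner
    return nearest.mean(axis=0)            # averaged (x, y, t)

pts = [(0.0, 0.0, 1.0), (0.1, 0.0, 1.1), (0.2, 0.1, 1.2),
       (5.0, 2.0, 1.3), (5.1, 2.0, 1.4)]
E = feature_point(pts, corner=(0.0, 0.0), n=3)  # approx (0.1, 0.033, 1.1)
```

Choosing ForE as E or F depending on which side of the radar the vehicle sits keeps the tracked point on the same physical part of the vehicle.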
Test Experiment 1: analysis of single-vehicle speed curves for the four lanes:
Data frames in which a single vehicle passes continuously are selected for each of the four lanes. Referring to FIGS. 5-8, these show the speed curves of single vehicles in the first, second, third and fourth lanes respectively.
Test Experiment 2: error analysis for the first and second lanes:
The raw image and grid-filtered result of frame 237 for the first-lane vehicle is shown in FIG. 9. Frame #237 is at the laser radar 0° position, which is the start and end of a laser radar frame of data. This causes the detected minimum point of the vehicle to actually be a point where frame #236 scanned the vehicle a very short time earlier, and not the real tail point.
The real tail point of the vehicle is then searched for around the 0° position of the laser radar; the algorithm is as follows:
R1: judge whether the vehicle is at the position directly opposite the laser radar;
R2: find the point set C containing the real tail point, according to the horizontal angle (azimuth) of the laser radar data;
R3: find x_min_revise in the point set C;
R4: find the N = 5 points nearest to that point, and average their x, y and t values to obtain the real vehicle tail point x_min_revise_ave;
R5: project that point onto the long side of the car, i.e. (x_min_revise_ave, y_min);
R6: use this point to calculate the speed of the vehicle.
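Steps R1-R6 can be sketched as follows. This is a hedged illustration: the point layout ((x, y, t, azimuth) tuples), the azimuth window and the helper name are assumptions, not the patent's actual implementation.

```python
def correct_tail_point(points, azimuth_zero=0.0, azimuth_window=2.0, n=5):
    """Find the real vehicle tail point near the lidar 0-degree position.

    points: list of (x, y, t, azimuth) measurements of one vehicle.
    """
    # R2: candidate set C around the 0-degree horizontal angle (azimuth)
    c = [p for p in points if abs(p[3] - azimuth_zero) <= azimuth_window]
    if not c:
        c = points
    # R3: x_min_revise, the point with minimum x in C
    x_min_pt = min(c, key=lambda p: p[0])
    # R4: average x and t over the N nearest points -> x_min_revise_ave
    near = sorted(c, key=lambda p: abs(p[0] - x_min_pt[0]))[:n]
    x_ave = sum(p[0] for p in near) / len(near)
    t_ave = sum(p[2] for p in near) / len(near)
    # R5: project onto the long side of the car, i.e. (x_min_revise_ave, y_min)
    y_min = min(p[1] for p in points)
    return (x_ave, y_min, t_ave)  # R6: this point is then used for the speed
```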
The result of the real tail-point search based on the laser radar's 0° position is shown in FIG. 10; point P is the real tail point. The speed over the frames in which the vehicle passes the 0° position has a minimum, because the tail point obtained there is not real. After correcting it to the real tail point, however, the time difference to the next frame becomes too small and the same error as above reappears. The remedy is to skip the frame at the laser radar's 0° position and calculate the vehicle speed from the frames immediately before and after it (their time interval is about 0.1 s).
The speed curve of the first-lane vehicle after this correction is shown in FIG. 11, and the speed curve of the second-lane vehicle in FIG. 12; the speeds of the originally abnormal frames are corrected and the curves become smooth.
Test experiment three: continuous vehicle speed analysis and error analysis.
The key to obtaining a continuous vehicle speed is the accuracy of vehicle detection and vehicle association. The algorithm was designed to fit frames 1-2500 closely, and frames 2500-5456 were used as a verification set to test the algorithm's performance. For the analysis of vehicle-association accuracy, the following evaluation indexes were selected:
1) Vehicle count (Vehicle_count). The actual number of vehicles passing through the sample set is known; if the number of vehicles obtained by the algorithm differs from the actual number, a vehicle-association error is indicated.
2) Maximum detection distance (DR). The distance between where a vehicle enters and where it leaves the sample set is fixed; if vehicle association succeeds, a vehicle moving in the positive x-axis direction enters the detection zone from the leftmost side and leaves it from the rightmost side.
3) Frame count (Frames). The number of frames from a vehicle entering the detection area to its leaving is related to its speed; as long as the speed does not vary over a wide range, the number of frames in which a vehicle is detected also falls within a fixed interval.
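For index 1), the error ratio and accuracy reported in Table 1 below follow directly from the real and detected vehicle counts; a minimal sketch (function name assumed):

```python
def count_accuracy(real_count, detected_count):
    """Vehicle-count evaluation: relative error and resulting accuracy."""
    error_ratio = abs(detected_count - real_count) / real_count
    return error_ratio, 1.0 - error_ratio
```

For the verification set (157 real, 159 detected vehicles) this reproduces the 1.27% error and 98.73% accuracy of Table 1.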
Tests were performed on the two data sets separately; the accuracy comparison of the vehicle-count results is shown in Table 1. For frames 1-2500, the detection distances of vehicles just entering and finally leaving are shown in FIG. 13, and the results for evaluation indexes 2 and 3 in FIG. 14; the corresponding results for frames 2500-5456 are summarized in Table 1.
In the table below, Real is the actual vehicle count, Detection the count measured by the method, Error Ratio the relative error, and Accuracy the resulting precision.
TABLE 1
Frames      Real   Detection   Error Ratio   Accuracy
1-2500      139    139         0.00%         100.00%
2500-5456   157    159         1.27%         98.73%

Claims (7)

1. A laser-radar-based continuous vehicle speed detection method, characterized in that a 16-line laser radar continuously collects road-surface scene data of the traffic lanes, and the average speed of each vehicle in the current frame data is then obtained from the collected K frames of data; the specific process is as follows:
1) From each two adjacent frames of data, calculate the distance matrix between all vehicles of the adjacent frames:

S = [ S_ij ],  i ∈ (0, m_k],  j ∈ (0, m_{k-1}],

where the k-th frame of data contains m_k vehicles and S_ij represents the distance between the class center of the i-th vehicle in the k-th frame data and the class center of the j-th vehicle in the (k-1)-th frame data:

S_ij = sqrt( (x_O^{k,i} - x_O^{k-1,j})^2 + (y_O^{k,i} - y_O^{k-1,j})^2 ),

where x_O^{k,i} and y_O^{k,i} denote the abscissa and ordinate of the class center of the i-th vehicle in the k-th frame data, and x_O^{k-1,j} and y_O^{k-1,j} denote the abscissa and ordinate of the class center of the j-th vehicle in the (k-1)-th frame data;
For k = 2, 3, …, K, this yields the distance matrices between the vehicles of all pairs of adjacent frames;
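Step 1) amounts to an m_k × m_{k-1} matrix of Euclidean distances between class centers; a minimal sketch (function and argument names assumed):

```python
import math

def distance_matrix(centers_k, centers_km1):
    """S[i][j]: distance between the class center of vehicle i in frame k
    and the class center of vehicle j in frame k-1; centers are (x, y)."""
    return [[math.hypot(xi - xj, yi - yj) for (xj, yj) in centers_km1]
            for (xi, yi) in centers_k]
```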
2) Obtain the vehicle incidence matrix from the distance matrices:
From the distance matrix of each pair of adjacent frames, obtain the vehicle association data of those two frames; summarizing the association data of all adjacent frame pairs gives the vehicle incidence matrix:

A = [ a_kj ],  k ∈ (0, K],  j ∈ (0, J],

where J is the number of vehicles in the data frame containing the most vehicles among the K frames of data; when the number of vehicles m_k in the k-th frame data is less than J, the entries a_{k,m_k+1}, …, a_{k,J} are all 0;
3) Traverse all vehicles in the k-th frame data and obtain the detected vehicle set P = {V_{id=1}, V_{id=2}, …, V_{id=count}}; the specific process is as follows:
Judge whether a_{ki}, corresponding to the i-th vehicle, is 0.
If a_{ki} is 0, judge whether the attribute id of the i-th vehicle is 0; if so, assign the i-th vehicle a new vehicle id = count + 1, update count = count + 1, create the object V_{id=count} and add it to the detected vehicle set P = {V_{id=1}, V_{id=2}, …, V_{id=count}}; otherwise the attribute id of the i-th vehicle is left unchanged.
If a_{ki} is not 0, associate the i-th vehicle with the vehicles in the detected vehicle set. If the association succeeds, set the vehicle's id to the id of the successfully associated vehicle and add it to the successfully associated vehicle set P′; if the association fails, assign the i-th vehicle a new vehicle id = count + 1, update count = count + 1, create the object V_{id=count} and add it to the detected vehicle set P = {V_{id=1}, V_{id=2}, …, V_{id=count}}.
After traversing all vehicles in the k-th frame data, the detected vehicle set P = {V_{id=1}, V_{id=2}, …, V_{id=count}} and the successfully associated vehicle set P′ = {V′_{id=1}, V′_{id=2}, …, V′_{id=count′}}, count′ ≤ count, are obtained.
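The traversal of step 3) can be sketched as follows. The `associate` callback stands in for the association procedure of claim 3 and is assumed to return the id of a matched detected vehicle or None; the dict-based vehicle representation is an illustration, not the patent's data structure.

```python
def traverse_frame(a_k, vehicles, detected, count, associate):
    """Traverse the vehicles of frame k, building the detected set P
    (here: dict id -> vehicle) and the successfully associated id set P'."""
    associated_ids = set()
    for i, v in enumerate(vehicles):
        if a_k[i] == 0:
            if v['id'] == 0:                  # brand-new vehicle
                count += 1
                v['id'] = count
                detected[count] = v
            # otherwise: id already assigned, leave it unchanged
        else:
            matched = associate(v, detected)
            if matched is not None:           # association succeeded
                v['id'] = matched
                associated_ids.add(matched)
            else:                             # association failed: new id
                count += 1
                v['id'] = count
                detected[count] = v
    return count, associated_ids
```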
4) For each vehicle in the successfully associated vehicle set, calculate its average speed:

v_{index'} = sqrt( (x_ForE^{k_{index',e}} - x_ForE^{k_{index',s}})^2 + (y_ForE^{k_{index',e}} - y_ForE^{k_{index',s}})^2 ) / ( t_{index',e} - t_{index',s} ),

where k_{index',s} is the number of the data frame in which the vehicle with vehicle id = index' first appears, and k_{index',e} the number of the data frame in which it last appears; x_ForE^{k_{index',s}} and y_ForE^{k_{index',s}} are the x-axis and y-axis coordinates of that vehicle's feature point ForE in the k_{index',s}-th frame data, and x_ForE^{k_{index',e}} and y_ForE^{k_{index',e}} those in the k_{index',e}-th frame data; t_{index',s} and t_{index',e} are the timestamps of the k_{index',s}-th and k_{index',e}-th data frames.
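The average-speed formula of step 4) — displacement of the ForE feature point between a vehicle's first and last frames divided by the elapsed time — can be written as:

```python
import math

def average_speed(fore_start, fore_end, t_start, t_end):
    """v = ||ForE(end) - ForE(start)|| / (t_end - t_start); points are (x, y)."""
    dx = fore_end[0] - fore_start[0]
    dy = fore_end[1] - fore_start[1]
    return math.hypot(dx, dy) / (t_end - t_start)
```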
2. The lidar-based continuous vehicle speed detection method according to claim 1, wherein the vehicle association data of two adjacent frames of data in step 2) are obtained as follows:
From the distance matrix obtained in step 1), select in turn the minimum distance S_ia of each row, which is the distance between the a-th vehicle in the (k-1)-th frame data and the i-th vehicle in the k-th frame data, and then further judge the following two conditions:
Condition 1: S_ia ≤ T_max_xy_move, where T_max_xy_move is the maximum distance a vehicle can travel between adjacent frames;
Condition 2: y_O(i) - y_O(a) ≤ T_max_y_move, where T_max_y_move is the maximum offset of a vehicle along the vertical axis between adjacent frames, i.e. its deviation from the travelling direction;
If both conditions are met, the association is considered successful.
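The two-condition check of claim 2 can be sketched as below; the absolute value on the vertical offset is an assumption (the claim states a maximum offset), and the names are illustrative:

```python
def adjacent_frames_associated(s_ia, y_i, y_a, t_max_xy_move, t_max_y_move):
    """Claim 2: the minimum-distance pairing S_ia succeeds only if both the
    travelled distance and the cross-lane (vertical) offset are bounded."""
    return s_ia <= t_max_xy_move and abs(y_i - y_a) <= t_max_y_move
```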
3. The lidar-based continuous vehicle speed detection method according to claim 1, wherein, when a_{ki} in step 3) is not 0, the i-th vehicle is associated with the vehicles in the detected vehicle set as follows:
Judgment condition 1: whether the associated vehicle is located in or near the lane of the vehicle to be associated, i.e. when the condition

| y_{O,ki} - y_O^{index',e} | ≤ T_max_xy_move

is satisfied, the associated vehicle is considered to be in or near the lane of the vehicle to be associated; here y_{O,ki} is the class-center ordinate of the vehicle to be associated, which is the i-th vehicle in the k-th frame data; y_O^{index',e} is the class-center ordinate, in the data frame of its last appearance, of the associated vehicle, i.e. the vehicle with vehicle id = index' in the detected vehicle set; and T_max_xy_move represents the maximum vertical-axis offset of a vehicle between adjacent frames;
Judgment condition 2: whether the associated vehicle appeared in a data frame shortly before the data frame of the vehicle to be associated, i.e. when the condition

k_{ki} - k_{index',e} ≤ T_max_frame_lose

is satisfied, the associated vehicle is considered to have appeared in a data frame close before that of the vehicle to be associated; here k_{ki} is the number of the data frame of the vehicle to be associated, k_{index',e} the number of the data frame of the associated vehicle's last appearance, and T_max_frame_lose the maximum allowed number of lost frames;
Judgment condition 3: whether the associated vehicle appeared before the position of the vehicle to be associated. First obtain the vehicle motion direction v_direction from the class-center ordinate y_{O,ki} of the vehicle to be associated and T_v_change_lane, the lane position at which the motion direction changes: v_direction = +1 when the vehicle moves in the positive direction of the horizontal axis and -1 when it moves in the negative direction. The positions of the vehicle to be associated and the associated vehicle should then satisfy the condition:

v_direction · ( x_{O,ki} - x_O^{index',e} ) > 0,

where x_{O,ki} is the class-center abscissa of the vehicle to be associated and x_O^{index',e} the class-center abscissa of the associated vehicle in the data frame of its last appearance;
If the three conditions are met simultaneously, the vehicle association is preliminarily judged successful;
Then, for each associated vehicle and vehicle to be associated whose association is preliminarily judged successful, calculate the difference between their distances at the current frame, and associate the pair corresponding to the minimum of these distance differences.
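The three judgment conditions of claim 3 can be sketched as a single predicate. This is illustrative only: the dict fields ('x', 'y', 'frame') and the exact inequality forms are assumptions drawn from the prose above.

```python
def preliminary_match(veh, cand, t_max_y, t_max_frame_lose, v_direction):
    """veh: vehicle to be associated (current frame); cand: last-seen state
    of an already-detected vehicle; v_direction is +1 or -1."""
    same_lane = abs(veh['y'] - cand['y']) <= t_max_y                 # cond. 1
    recent = 0 < veh['frame'] - cand['frame'] <= t_max_frame_lose    # cond. 2
    appeared_before = v_direction * (veh['x'] - cand['x']) > 0       # cond. 3
    return same_lane and recent and appeared_before
```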
4. The lidar-based continuous vehicle speed detection method according to claim 3, wherein, for each associated vehicle and vehicle to be associated whose association is preliminarily judged successful, the distance difference between their positions at the current frame is estimated as follows:
From the average speed v_{index'} of the associated vehicle, estimate the distance it has travelled up to the current frame:

S_{index'} = v_{index'} ( t_{ki} - t_{index',e} ),

where k is the number of the current data frame, t_{ki} the timestamp of the current frame and t_{index',e} the timestamp of the data frame of the associated vehicle's last appearance;
The observed distance between the vehicle to be associated and the associated vehicle is:

S_{index',ki} = sqrt( (x_{ForE,ki} - x_ForE^{index',e})^2 + (y_{ForE,ki} - y_ForE^{index',e})^2 ),

where x_{ForE,ki} and y_{ForE,ki} are the x-axis and y-axis coordinates of the feature point ForE of the vehicle to be associated in the current frame data, and x_ForE^{index',e} and y_ForE^{index',e} those of the associated vehicle's feature point ForE in the data frame of its last appearance;
This yields the distance difference | S_{index',ki} - S_{index'} | between the associated vehicle and the vehicle to be associated at the current frame.
5. The lidar-based continuous vehicle speed detection method according to claim 3, wherein the associated vehicle and vehicle to be associated corresponding to the minimum distance difference are associated as follows:
Take the minimum of the distance differences between the vehicle to be associated and all associated vehicles:

ΔS_{min_index',ki} = min | S_{index',ki} - S_{index'} |,

where min_index' is the vehicle id of the associated vehicle with the smallest distance difference to the vehicle to be associated. When this minimum satisfies the condition ΔS_{min_index',ki} ≤ T_{min_ΔS}, the association is judged successful and the id of the vehicle to be associated is assigned the value min_index'; T_{min_ΔS} is the allowed error of the distance estimate.
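Claims 4 and 5 together amount to picking, among the preliminarily matched candidates, the one whose speed-predicted travel best explains the observed displacement; a hedged sketch (field names assumed):

```python
import math

def best_association(veh, candidates, t_now, t_min_delta_s):
    """candidates: id -> dict with 'x', 'y' (last-seen ForE position),
    'v' (average speed) and 't_last' (last-seen timestamp).
    Returns the best candidate id, or None if no candidate is acceptable."""
    best_id, best_delta = None, None
    for cid, c in candidates.items():
        predicted = c['v'] * (t_now - c['t_last'])          # claim 4: S_index'
        observed = math.hypot(veh['x'] - c['x'], veh['y'] - c['y'])
        delta = abs(observed - predicted)
        if best_delta is None or delta < best_delta:
            best_id, best_delta = cid, delta
    if best_delta is not None and best_delta <= t_min_delta_s:  # claim 5
        return best_id
    return None
```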
6. The method according to claim 1, wherein the vehicle data in the data obtained by the lidar are represented in the form of a vehicle box model, whose vehicle feature points include the E point, the F point, the O point and the combined feature point ForE, the O point representing the center point of the vehicle, also called the class center of the vehicle.
7. The lidar-based continuous vehicle speed detection method according to claim 6, wherein the E and F points are obtained by selecting the N points nearest to A and to B respectively and averaging their x-axis and y-axis coordinates, and their timestamp information is the average of the timestamps of those N nearest points.
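The averaging of claim 7 can be sketched as follows (the list-of-tuples point format and the squared-distance nearest-neighbour criterion are assumptions):

```python
def corner_feature_point(points, anchor, n=5):
    """E (or F) point: average x, y and timestamp of the N points of the
    vehicle nearest to corner A (or B). points: list of (x, y, t)."""
    nearest = sorted(points,
                     key=lambda p: (p[0] - anchor[0]) ** 2
                                   + (p[1] - anchor[1]) ** 2)[:n]
    m = len(nearest)
    return (sum(p[0] for p in nearest) / m,
            sum(p[1] for p in nearest) / m,
            sum(p[2] for p in nearest) / m)
```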
CN201711219157.3A 2017-11-28 2017-11-28 Laser radar-based continuous vehicle speed detection method Active CN107705563B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711219157.3A CN107705563B (en) 2017-11-28 2017-11-28 Laser radar-based continuous vehicle speed detection method

Publications (2)

Publication Number Publication Date
CN107705563A CN107705563A (en) 2018-02-16
CN107705563B true CN107705563B (en) 2020-01-31

Family

ID=61181116

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711219157.3A Active CN107705563B (en) 2017-11-28 2017-11-28 Laser radar-based continuous vehicle speed detection method

Country Status (1)

Country Link
CN (1) CN107705563B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108828608B (en) * 2018-03-29 2021-08-17 苏州大学张家港工业技术研究院 Laser radar background data filtering method in vehicle detection method
EP3621052A1 (en) * 2018-09-05 2020-03-11 VITRONIC Dr.-Ing. Stein Bildverarbeitungssysteme GmbH Method for analysing the driving behaviour of motor vehicles, including autonomous vehicles
CN110378178B (en) * 2018-09-30 2022-01-28 毫末智行科技有限公司 Target tracking method and device
CN109598947B (en) * 2018-12-26 2021-05-11 武汉万集信息技术有限公司 Vehicle identification method and system
CN109814102B (en) * 2019-01-31 2020-10-27 厦门精益远达智能科技有限公司 Single lane superelevation monitoring method, device, equipment and storage medium
US10943132B2 (en) * 2019-04-10 2021-03-09 Black Sesame International Holding Limited Distant on-road object detection
CN110648538B (en) * 2019-10-29 2022-02-01 苏州大学 Traffic information sensing system and method based on laser radar network
CN111540201B (en) * 2020-04-23 2021-03-30 山东大学 Vehicle queuing length real-time estimation method and system based on roadside laser radar

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101604448A (en) * 2009-03-16 2009-12-16 北京中星微电子有限公司 A kind of speed-measuring method of moving target and system
CN102722886A (en) * 2012-05-21 2012-10-10 浙江捷尚视觉科技有限公司 Video speed measurement method based on three-dimensional calibration and feature point matching
CN104318782A (en) * 2014-10-31 2015-01-28 浙江力石科技股份有限公司 Expressway video speed measuring method and system for zone overlapping
CN106781537A (en) * 2016-11-22 2017-05-31 武汉万集信息技术有限公司 A kind of overspeed of vehicle grasp shoot method and system
CN107341819A (en) * 2017-05-09 2017-11-10 深圳市速腾聚创科技有限公司 Method for tracking target and storage medium




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant