CN109285163B - Laser point cloud based lane line left and right contour line interactive extraction method - Google Patents


Info

Publication number
CN109285163B
Authority
CN
China
Prior art keywords
point
points
lane line
line
lane
Prior art date
Legal status
Active
Application number
CN201811032540.2A
Other languages
Chinese (zh)
Other versions
CN109285163A (en)
Inventor
惠念
Current Assignee
Heading Data Intelligence Co Ltd
Original Assignee
Heading Data Intelligence Co Ltd
Priority date
Filing date
Publication date
Application filed by Heading Data Intelligence Co Ltd
Priority to CN201811032540.2A
Publication of CN109285163A
Application granted
Publication of CN109285163B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30241 Trajectory
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256 Lane; Road marking

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a method for interactively extracting the left and right contour lines of lane lines from laser point clouds, comprising the following steps: S1, manually specify N lane-line starting points on the laser point cloud operation platform; S2, execute S3-S6 for each starting point in turn; S3, segment the point cloud with a polygon; S4, filter the segmented point cloud; S5, adjust the coordinates of the contour point pairs using the reference reflection intensity and the standard lane-line width; S6, if the number of collected left and right contour points is 0, finish the lane line for the current starting point and run S3-S6 for the next starting point until all starting points have been processed; otherwise, take the midpoint of the tail point pair of the currently collected left and right contour lines as a new starting point and continue until the number of collected contour shape points is 0, then finish the lane line for the current starting point and run S3-S6 for the next starting point until all starting points have been processed.

Description

Laser point cloud based lane line left and right contour line interactive extraction method
Technical Field
The invention belongs to the field of high-precision electronic maps, and particularly relates to a laser point cloud-based interactive extraction method for left and right contour lines of a lane line.
Background
In recent years, automatic driving technology has developed rapidly. The high-precision electronic map, one of the essential "sensors" of an autonomous car, carries lane-level road shape information, POI information, ADAS information, topology information, and the like. The shape and position information of roads and POIs can be used for high-precision positioning and for constructing topology and ADAS information. Road-network topology can be used for path planning, the topology between roads and POIs for along-route POI retrieval, and ADAS information for advanced driver-assistance systems. At present, the shapes and positions of roads and POIs in high-precision maps are mainly extracted from laser point clouds. A laser point cloud is a massive set of points that captures the spatial distribution and surface characteristics of a target by high-precision laser scanning. Extracting the left and right contour lines of lane lines from a laser point cloud means obtaining the shape feature points of those contour lines from the cloud; these feature points are ordered, carry real position information, and amount to far less data than the original point cloud.
Lane line collection is a major, time-consuming step in producing a high-precision electronic map and is also key to ensuring that the map inherits the high precision of the laser point cloud. Existing laser-point-cloud-based lane line extraction methods include purely manual collection; converting the point cloud to an image and extracting with the Hough transform; converting the point cloud to an image and extracting with deep learning; and filtering by point cloud reflection intensity followed by connectivity-based identification. Purely manual collection achieves high planar precision relative to the point cloud, but the uncertain spacing of collected shape points makes the elevation precision uncertain; it is also slow, its accuracy depends heavily on the diligence of the operator, and its cost is high. Converting the point cloud to an image and applying traditional edge detection with the Hough transform cannot handle curves effectively, and converting between image coordinates and real point cloud coordinates loses some precision. Converting the point cloud to an image and extracting lane lines with deep learning depends strongly on labeling quality, requires a large number of samples whose production demands heavy manual effort, and likewise cannot avoid the precision loss of coordinate conversion. Methods that extract lane lines directly on the point cloud by reflection-intensity filtering and connectivity identification guarantee the accuracy of the lane-line width only under favorable conditions.
Disclosure of Invention
In view of the above, the invention provides an interactive extraction method for left and right contour lines of a lane line based on laser point cloud.
A method for interactively extracting the left and right contour lines of lane lines from laser point clouds comprises the following steps:
S1, manually specify N lane-line starting points (N ≥ 1) on the laser point cloud operation platform and choose manual-guided or trajectory-guided collection; for trajectory guidance, the source data must include trajectory data matched to the point cloud as the guide point set; for manual guidance, guide points along the extension direction of the lane line must be input;
S2, execute S3-S6 for each starting point in turn;
S3, find the guide point nearest the starting point and obtain its heading; from the starting point, step forward along the heading and offset left and right perpendicular to the heading to compute a left guide line and a right guide line; connect the two guide lines into a polygon and segment the point cloud with it;
S4, compute the probability distribution of the reflection intensity of the segmented point cloud, take a comparatively high intensity with a certain probability as the reference reflection intensity, and filter the segmented point cloud with it;
S5, traverse the shape points on the left and right guide lines and, for each, retrieve the closest point in the segmented point cloud from S4, yielding point pairs on the left and right contour lines of a lane line; adjust the coordinates of these point pairs using the reference reflection intensity and the standard lane-line width;
S6, if the number of collected left and right contour points is 0, finish the lane line for the current starting point and run S3-S6 for the next starting point until all starting points have been processed; otherwise, take the midpoint of the tail point pair of the currently collected contour lines as a new starting point and continue until the number of collected contour shape points is 0, then finish the lane line for the current starting point and run S3-S6 for the next starting point until all starting points have been processed; if the current guidance mode is manual, the set of midpoints of the collected contour point pairs must be used as the new guide point set;
S7, smooth the plane and elevation of the collected left and right contour lines of each lane line; compute the length of each contour line, take the most reliable lane line as the reference, project the head and tail points of the other lane lines onto it, intercept the portion by which the reference extends beyond each projection, and append it to the ends of the other lane lines;
S8, compute the midpoints of the tail points of the left and right contour lines of each lane line as new starting points and execute S2, until the maximum length among the lane contour lines computed from the starting points is smaller than a preset length.
In the interactive extraction method of the left and right contour lines of the lane line based on the laser point cloud,
the step S1 includes:
S11, judge whether the laser point cloud in the current area has a matched trajectory; if the lane line shape is similar to the trajectory, i.e. the road has no lane expansion or reduction and the trajectory contains no lane change, click the trajectory-guided lane line collection button for trajectory-guided collection; otherwise, click the manual-guided lane line collection button for manual-guided collection; there are N lane lines, N ≥ 1;
S12, for trajectory-guided collection, mark one point inside the white paint of each lane line in the laser point cloud; for manual-guided collection, mark one point inside the white paint of each lane line, then mark a further point on the same white line along the extension direction of the lane line, and compute the heading h of the last two points.
In the interactive extraction method of the left and right contour lines of the lane line based on the laser point cloud,
the step S3 includes:
S31, take the input starting point as base point P1, find the guide point G1 nearest to P1 among the guide points, and obtain its heading h1;
S32, at P1, offset left and right by distance sd perpendicular to heading h1 to compute the left reference point L1 and the right reference point R1; step forward by fd along heading h1 to obtain the next base point P2, find the guide point G2 nearest to P2, and obtain its heading h2;
S33, repeat the method of S32 forward along the guide points M times to obtain left guide-line points L1-LM and right guide-line points R1-RM; connect the left points in order and the right points in reverse order to construct the polygon;
S34, call the PCL point cloud processing library and segment the point cloud with polygon, obtaining the point cloud polygonPC.
In the interactive extraction method of the left and right contour lines of the lane line based on the laser point cloud,
the step S4 includes:
S41, traverse each point of polygonPC, count the number of points whose reflection intensity falls in each bin of width interval, and divide by the total number of points to obtain the probability of each intensity bin; the bin of maximum probability is the reflection intensity of the non-lane-line (road surface) point cloud; take the start of the highest intensity bin as the maximum reflection intensity maxi; search from the highest intensity bin down to the lowest and take the first bin whose probability exceeds prob as the lane-line reflection intensity whitei; take the larger of whitei and the reflection intensity at base point P1 as the base reflection intensity basei;
S42, call the PCL point cloud processing library and filter polygonPC by reflection intensity with lower bound basei and upper bound maxi, obtaining the point cloud intensityPC.
In the interactive extraction method of the left and right contour lines of the lane line based on the laser point cloud,
the step S5 includes:
S51, traverse shape points L1-LM on the left guide line, call PCL, and use KdTreeFLANN to retrieve their closest points PL1-PLM in three-dimensional space on intensityPC; traverse shape points R1-RM on the right guide line, call PCL, and use KdTreeFLANN to retrieve their closest points PR1-PRM on intensityPC;
S52, for each point pair (PLi, PRi), i from 1 to M, compute the distance lrWidth; if lrWidth is smaller than the standard lane-line width stdWidth, compute the center point Ci of PLi and PRi, find the heading hi of the guide point nearest Ci, and compute calcPLi and calcPRi to the left and right perpendicular to that heading; set PLi = calcPLi and PRi = calcPRi;
S53, compute the center point Ci of PLi and PRi; if the reflection intensity at Ci is smaller than the lane-line reflection intensity whitei, compare the reflection intensities of PLi and PRi: if that of PLi is smaller, move both PLi and PRi by wdist in the direction of PRi; if that of PLi is larger, move both points by wdist in the direction of PLi.
In the interactive extraction method of the left and right contour lines of the lane line based on the laser point cloud,
the step S6 includes:
S61, if the number of collected left and right contour points is 0, finish the current lane line and run S3-S6 for the next starting point until all starting points have been processed; otherwise, execute S62;
S62, if the current guidance mode is manual guidance, use the set of midpoints of the collected left and right contour point pairs as the loaded guide point set; if the mode is trajectory guidance, this is not needed;
S63, compute the midpoint CM of the tail points PLM and PRM of the currently collected left and right contour lines and execute S3-S5 with CM as the new starting point, until the number of collected contour shape points is 0; then finish the current lane line and run S3-S6 for the next starting point until all starting points have been processed.
In the interactive extraction method of the left and right contour lines of the lane line based on the laser point cloud,
the step S7 includes:
s71, executing the step S72 on each acquired lane line;
S72, traverse the shape points on the lane line; starting from the second shape point, compute the planar distance to the adjacent previous shape point, and if it is smaller than minDist, reject the current shape point;
S73, traverse the result of S72; starting from the second shape point, compute the included angle of the vectors formed with the adjacent previous and next shape points, and if it is not within the interval [π-θ, π+θ], reject the current shape point;
S74, traverse the result of S73; starting from the second shape point, compute the z-value difference to the adjacent previous shape point, and if the difference is greater than zthreshold, modify the z of the current shape point so that the difference equals zthreshold;
S75, compute the length of each left contour line and take the longest as the reference line; compute the projection points of the tail points of the other lane lines onto the reference line; split the reference line at each projection point, take the part between the projection point and the tail of the reference line, add the coordinate offset between the projection point and the original tail point, and append the copy to the tail of the corresponding lane line.
Compared with the prior art, the laser-point-cloud-based interactive extraction method for the left and right contour lines of lane lines has the following beneficial effects: the PCL open-source point cloud library performs the point cloud computation quickly, ensuring efficiency; the point cloud need not be converted to an image, which avoids the precision loss of converting between point cloud and image coordinates; and since the left and right contour lines of a lane line lie where the reflection intensity changes sharply, extracting them is important for preserving the relative position of the lane line and the point cloud.
Drawings
FIG. 1 is a flow chart of an interactive extraction method for left and right contour lines of a lane based on laser point cloud;
FIG. 2 is a schematic diagram of the left and right contour lines and shape points of a lane line;
FIG. 3 is a schematic diagram of user input points and trajectory guide points;
FIG. 4 is a schematic diagram of a user input point and a manual guide point;
FIG. 5 is a schematic diagram of a point cloud segmentation polygon and left and right reference points;
FIG. 6 is a schematic diagram of left and right contour lines of a plurality of lane lines on a point cloud;
FIG. 7 is a schematic diagram of fine tuning calculation of the shape points of the left and right contour lines of the lane lines.
Detailed Description
As shown in fig. 1 to 7, an interactive extraction method of left and right contour lines of a lane line based on laser point cloud includes the following steps:
S1, manually specify N lane-line starting points (N ≥ 1) on the laser point cloud operation platform and choose manual-guided or trajectory-guided collection; for trajectory guidance, the source data must include trajectory data matched to the point cloud as the guide point set; for manual guidance, guide points along the extension direction of the lane line must be input;
S2, execute S3-S6 for each starting point in turn;
S3, find the guide point nearest the starting point and obtain its heading; from the starting point, step forward along the heading and offset left and right perpendicular to the heading to compute a left guide line and a right guide line; connect the two guide lines into a polygon and segment the point cloud with it.
The heading of a guide point is the angle between due north and the vector from the guide point to the next adjacent guide point.
S4, compute the probability distribution of the reflection intensity of the segmented point cloud, take a comparatively high intensity with a certain probability as the reference reflection intensity, and filter the segmented point cloud with it.
The probability referred to as "a certain probability" is less than 50%.
S5, traverse the shape points on the left and right guide lines and, for each, retrieve the closest point in the segmented point cloud from S4, yielding point pairs on the left and right contour lines of a lane line; adjust the coordinates of these point pairs using the reference reflection intensity and the standard lane-line width;
S6, if the number of collected left and right contour points is 0, finish the lane line for the current starting point and run S3-S6 for the next starting point until all starting points have been processed; otherwise, take the midpoint of the tail point pair of the currently collected contour lines as a new starting point and continue until the number of collected contour shape points is 0, then finish the lane line for the current starting point and run S3-S6 for the next starting point until all starting points have been processed; if the current guidance mode is manual, the set of midpoints of the collected contour point pairs must be used as the new guide point set;
S7, smooth the plane and elevation of the collected left and right contour lines of each lane line; compute the length of each contour line, take the most reliable lane line as the reference, project the head and tail points of the other lane lines onto it, intercept the portion by which the reference extends beyond each projection, and append it to the ends of the other lane lines.
Elevation refers to the z value of the contour line obtained from the laser point cloud. "Highest reliability" means that the longest lane line is taken as the reference.
S8, compute the midpoints of the tail points of the left and right contour lines of each lane line as new starting points and execute S2, until the maximum length among the lane contour lines computed from the starting points is smaller than a preset length. The preset length may be less than 1 m.
In the interactive extraction method of the left and right contour lines of the lane line based on the laser point cloud,
the step S1 includes:
S11, judge whether the laser point cloud in the current area has a matched trajectory; if the lane line shape is similar to the trajectory, i.e. the road has no lane expansion or reduction and the trajectory contains no lane change, click the trajectory-guided lane line collection button for trajectory-guided collection; otherwise, click the manual-guided lane line collection button for manual-guided collection; there are N lane lines, N ≥ 1. "Similar" means that the trajectory is parallel to the painted lane lines on the point cloud.
S12, for trajectory-guided collection, mark one point inside the white paint of each lane line in the laser point cloud; for manual-guided collection, mark one point inside the white paint of each lane line, then mark a further point on the same white line along the extension direction of the lane line, and compute the heading h of the last two points.
In step S12, each point marked inside the white paint is a starting point; under manual guidance, all points along the extension direction of the lane line except the last serve both as starting points and as guide points. Each such point and the last point along the extension direction are used to compute the heading h, which is assigned to the starting point, so every starting point carries a heading attribute.
In the interactive extraction method of the left and right contour lines of the lane line based on the laser point cloud,
the step S3 includes:
S31, take the input starting point as base point P1, find the guide point G1 nearest to P1 among the guide points, and obtain its heading h1;
S32, at P1, offset left and right by distance sd perpendicular to heading h1 to compute the left reference point L1 and the right reference point R1; step forward by fd along heading h1 to obtain the next base point P2, find the guide point G2 nearest to P2, and obtain its heading h2;
S33, repeat the method of S32 forward along the guide points M times to obtain left guide-line points L1-LM and right guide-line points R1-RM; connect the left points in order and the right points in reverse order to construct the polygon;
S34, call the PCL point cloud processing library and segment the point cloud with polygon, obtaining the point cloud polygonPC.
In the interactive extraction method of the left and right contour lines of the lane line based on the laser point cloud,
the step S4 includes:
S41, traverse each point of polygonPC, count the number of points whose reflection intensity falls in each bin of width interval, and divide by the total number of points to obtain the probability of each intensity bin; the bin of maximum probability is the reflection intensity of the non-lane-line (road surface) point cloud; take the start of the highest intensity bin as the maximum reflection intensity maxi; search from the highest intensity bin down to the lowest and take the first bin whose probability exceeds prob (optionally, prob may take the value 10%) as the lane-line reflection intensity whitei; take the larger of whitei and the reflection intensity at base point P1 as the base reflection intensity basei;
S42, call the PCL point cloud processing library and filter polygonPC by reflection intensity with lower bound basei and upper bound maxi, obtaining the point cloud intensityPC.
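One reading of the S41 histogram logic, sketched in Python. The text is ambiguous about whether maxi is the lower or upper edge of the highest bin; the sketch uses the upper edge so that the subsequent [basei, maxi] filter retains the bright lane-paint points. The names interval, prob, whitei, and maxi follow the text; the concrete values and demo data are invented:

```python
from collections import Counter

def intensity_bounds(intensities, interval=5, prob=0.10):
    """Step S41 (one reading): histogram the reflection intensities in
    bins of width `interval`.  The most probable bin is treated as road
    surface; scanning from the highest occupied bin downward, the first
    bin whose probability exceeds `prob` gives the lane-paint intensity
    whitei.  maxi is taken as the upper edge of the highest occupied bin.
    Returns (whitei, maxi); whitei is None if no bin qualifies."""
    n = len(intensities)
    hist = Counter(int(v // interval) for v in intensities)
    top_bin = max(hist)
    maxi = (top_bin + 1) * interval
    whitei = None
    for b in range(top_bin, min(hist) - 1, -1):   # scan high -> low
        if hist.get(b, 0) / n > prob:
            whitei = b * interval
            break
    return whitei, maxi

# Invented demo: 80 dark road-surface points and 20 bright lane-paint points.
vals = [10.0] * 80 + [60.0] * 20
whitei, maxi = intensity_bounds(vals)
kept = [v for v in vals if whitei <= v <= maxi]   # the S42 intensity filter
```

On the demo data the road-surface bin (around intensity 10) dominates, and the filter keeps only the 20 bright points.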
In the interactive extraction method of the left and right contour lines of the lane line based on the laser point cloud,
the step S5 includes:
S51, traverse shape points L1-LM on the left guide line, call PCL, and use KdTreeFLANN to retrieve their closest points PL1-PLM in three-dimensional space on intensityPC; traverse shape points R1-RM on the right guide line, call PCL, and use KdTreeFLANN to retrieve their closest points PR1-PRM on intensityPC;
S52, for each point pair (PLi, PRi), i from 1 to M, compute the distance lrWidth; if lrWidth is smaller than the standard lane-line width stdWidth, compute the center point Ci of PLi and PRi, find the heading hi of the guide point nearest Ci, and compute calcPLi and calcPRi to the left and right perpendicular to that heading; set PLi = calcPLi and PRi = calcPRi;
S53, compute the center point Ci of PLi and PRi; if the reflection intensity at Ci is smaller than the lane-line reflection intensity whitei, compare the reflection intensities of PLi and PRi: if that of PLi is smaller, move both PLi and PRi by wdist in the direction of PRi; if that of PLi is larger, move both points by wdist in the direction of PLi.
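The width adjustment of S52 can be sketched as follows, assuming 2-D point pairs and a clockwise-from-north heading. The function name and the re-spreading of the pair about its midpoint are one reading of the text, not the patented implementation:

```python
import math

def adjust_pair(pl, pr, h, std_width):
    """Step S52 (one reading): if a left/right contour point pair is
    narrower than the standard lane-line width stdWidth, re-spread it
    about its midpoint, perpendicular to the local heading h."""
    if math.dist(pl, pr) >= std_width:
        return pl, pr                                   # wide enough: keep
    cx, cy = (pl[0] + pr[0]) / 2, (pl[1] + pr[1]) / 2   # center point Ci
    fwd = (math.sin(h), math.cos(h))                    # along heading
    lft = (-fwd[1], fwd[0])                             # unit vector, left
    half = std_width / 2
    calc_pl = (cx + half * lft[0], cy + half * lft[1])  # calcPLi
    calc_pr = (cx - half * lft[0], cy - half * lft[1])  # calcPRi
    return calc_pl, calc_pr
```

A pair only 0.10 m apart under a 0.15 m standard width is widened symmetrically to 0.15 m; a pair already wider is returned unchanged.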
In the interactive extraction method of the left and right contour lines of the lane line based on the laser point cloud,
the step S6 includes:
S61, if the number of collected left and right contour points is 0, finish the current lane line and run S3-S6 for the next starting point until all starting points have been processed; otherwise, execute S62;
S62, if the current guidance mode is manual guidance, use the set of midpoints of the collected left and right contour point pairs as the loaded guide point set; if the mode is trajectory guidance, this is not needed;
S63, compute the midpoint CM of the tail points PLM and PRM of the currently collected left and right contour lines and execute S3-S5 with CM as the new starting point, until the number of collected contour shape points is 0; then finish the current lane line and run S3-S6 for the next starting point until all starting points have been processed.
In the interactive extraction method of the left and right contour lines of the lane line based on the laser point cloud,
the step S7 includes:
s71, executing the step S72 on each acquired lane line;
s72, traversing the shape points on the lane line, starting from the second shape point, calculating the plane distance from the adjacent last shape point, and if the plane distance is smaller than minDist, rejecting the current shape point; and minDist is the step of eliminating the current point when the distance from the current point to the last adjacent point is less than a given threshold, and optionally, the minDist value can be 1 m.
S73: traversing the result of S72, starting from the second shape point, calculating the included angle of the vectors formed by the current point with its adjacent previous and next shape points, and rejecting the current shape point if the angle is not within the interval [π-θ, π+θ]; π is the circle constant and θ is an angle threshold, which may be set to π/6. The included angle is that between the two vectors from the current point to its two neighbors; when it falls outside the given interval, the line has a sharp corner at that position and the current point is rejected.
S74: traversing the result of S73, starting from the second shape point, calculating the difference in z value from the adjacent previous shape point, and if the difference is greater than zthreshold, modifying the z of the current shape point so that the difference equals zthreshold; zthreshold is a threshold on the z value. That is, when the z-value difference between the current point and the adjacent previous point exceeds the given threshold, the z value of the current point is modified.
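The S72-S74 smoothing steps can be sketched as one pipeline. This is a minimal stand-alone sketch using the thresholds suggested above (minDist = 1 m, θ = π/6); `z_threshold` is an assumed value, since the text leaves zthreshold unspecified.

```python
import math

def smooth_contour(points, min_dist=1.0, theta=math.pi / 6, z_threshold=0.5):
    """Sketch of the S72-S74 smoothing on a list of (x, y, z) shape points.
    min_dist and theta follow the values suggested in the text;
    z_threshold is an assumed value."""
    # S72: drop points whose planar distance to the previous kept point
    # is below min_dist
    kept = [points[0]]
    for p in points[1:]:
        q = kept[-1]
        if math.hypot(p[0] - q[0], p[1] - q[1]) >= min_dist:
            kept.append(p)

    # S73: drop interior points where the angle between the vectors to the
    # previous and next neighbors leaves [pi - theta, pi + theta]
    # (i.e. the polyline has a sharp corner there)
    if len(kept) >= 3:
        out = [kept[0]]
        for i in range(1, len(kept) - 1):
            a, b, c = kept[i - 1], kept[i], kept[i + 1]
            v1 = (a[0] - b[0], a[1] - b[1])
            v2 = (c[0] - b[0], c[1] - b[1])
            cos_ang = ((v1[0] * v2[0] + v1[1] * v2[1])
                       / (math.hypot(*v1) * math.hypot(*v2)))
            ang = math.acos(max(-1.0, min(1.0, cos_ang)))
            if ang >= math.pi - theta:     # acos never exceeds pi
                out.append(b)
        out.append(kept[-1])
    else:
        out = kept

    # S74: clamp the z jump relative to the previous (already clamped) point
    clamped = [out[0]]
    for p in out[1:]:
        zp = clamped[-1][2]
        dz = p[2] - zp
        if abs(dz) > z_threshold:
            p = (p[0], p[1], zp + math.copysign(z_threshold, dz))
        clamped.append(p)
    return clamped
```

Note the order matters: the distance filter runs on the raw points, the corner filter on its output, and the z-clamp on the corner filter's output, matching the S72 → S73 → S74 sequence.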
S75: calculating the length of each left lane line contour line and taking the longest one as a reference line; calculating the projection points of the tail points of the other lane lines onto the reference line; dividing the reference line at each projection point and taking the part between the projection point and the tail of the reference line; adding to that part the coordinate offset between the projection point and the original tail point; and copying the shifted part to the tail of the original lane line.
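S75 can be sketched roughly as below. The sketch works on 2-D polylines and simplifies one detail: instead of splitting the reference segment exactly at the projection point, it copies the reference from the next vertex onward. All names are illustrative, not from the patent.

```python
import math

def _project_on_segment(p, a, b):
    """Closest point to p on segment a-b, plus the segment parameter t."""
    ax, ay = a
    bx, by = b
    dx, dy = bx - ax, by - ay
    seg2 = dx * dx + dy * dy
    t = 0.0 if seg2 == 0 else max(0.0, min(1.0, ((p[0] - ax) * dx + (p[1] - ay) * dy) / seg2))
    return (ax + t * dx, ay + t * dy), t

def extend_to_reference(line, ref):
    """Sketch of S75: project the tail of `line` onto the reference polyline
    `ref`, take the part of `ref` beyond the projection, shift it by the
    offset between the tail and its projection, and append it to `line`."""
    tail = line[-1]
    # find the projection of the tail point onto the reference polyline
    best = None
    for i in range(len(ref) - 1):
        q, _ = _project_on_segment(tail, ref[i], ref[i + 1])
        d = math.hypot(tail[0] - q[0], tail[1] - q[1])
        if best is None or d < best[0]:
            best = (d, i, q)
    _, i, q = best
    # coordinate offset from the projection point to the original tail
    off = (tail[0] - q[0], tail[1] - q[1])
    # copy the reference beyond the projection, shifted by that offset
    extra = [(x + off[0], y + off[1]) for (x, y) in ref[i + 1:]]
    return line + extra
```

The offset keeps the copied tail parallel to the reference line at the short line's lateral position, which is what makes the extended contours end evenly.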
The embodiment of the invention adopts an interactive, semi-automatic extraction method, because existing fully automatic extraction methods cannot be applied at scale in engineering projects given the diversity of real roads. An automatic extraction method that allows manual intervention simplifies the operation of lane line extraction; assisted by flexible UI interaction, it lets operators quickly correct positions where the algorithm's detection is unsatisfactory, achieving an efficient combination of manual work and algorithm that better handles the varied road conditions encountered in actual high-precision map production. The PCL open-source point cloud library is used for fast point cloud computation, which ensures efficiency and avoids converting the point cloud into an image, thereby avoiding the precision loss incurred when converting between point cloud and image. The left and right contour lines of a lane line are the positions where the reflection intensity changes most markedly, so extracting them is important for preserving the relative position of the lane line with respect to the point cloud.
It is understood that various other changes and modifications may be made by those skilled in the art based on the technical idea of the present invention, and all such changes and modifications should fall within the protective scope of the claims of the present invention.

Claims (7)

1. A method for interactively extracting left and right contour lines of a lane line based on laser point cloud is characterized by comprising the following steps:
S1, manually specifying N lane line starting points on the laser point cloud operation platform, where N ≥ 1, and selecting manual guidance acquisition or track guidance acquisition; when track guidance is selected, the source data must be accompanied by track data matched with the point cloud as the guidance point set; when manual guidance acquisition is selected, a guidance point along the extension direction of the lane line must be input;
S2, sequentially executing S3-S6 for each starting point;
S3, finding the guide point nearest to the starting point, obtaining the heading of that guide point, calculating a left guide line and a right guide line from the starting point forward along the heading and to the left and right perpendicular to the heading, connecting the left and right guide lines into a polygon, and segmenting the point cloud with the polygon;
S4, calculating the probability distribution of the reflection intensity of the segmented point cloud, taking a larger reflection intensity of a certain probability as the reference reflection intensity, and filtering the segmented point cloud with it;
S5, traversing the shape points on the left and right guide lines, retrieving for each the closest point on the segmented point cloud obtained in step S4, and obtaining point pairs on the left and right contour lines of a lane line; adjusting the coordinates of the point pairs by combining the reference reflection intensity and the standard lane line width; the shape points are shape characteristic points;
S6, if the number of collected left and right contour line points of the lane line is 0, finishing the calculation of the lane line corresponding to the current starting point, and executing steps S3-S6 for the next starting point until all starting points are calculated; otherwise, taking the midpoint of the tail point pair of the currently acquired left and right contour lines as a new starting point and continuing the calculation until the number of shape points of the acquired left and right contour lines is 0, then finishing the calculation of the lane line corresponding to the current starting point and executing S3-S6 for the next starting point until all starting points are calculated; if the current guidance mode is manual guidance, the collected set of midpoints of the left and right contour line point pairs is used as the new guidance point set;
S7, smoothing the plane and elevation of the acquired left and right contour lines of each lane line, calculating the lengths of the left and right contour lines respectively, taking the most reliable lane line as a reference, projecting the positions of the head and tail points of the other lane lines onto it, intercepting the part of the reference lane line extending beyond the other projections, and appending that part to the tail ends of the other lane lines;
S8, calculating the center points of the tail points of the left and right contour lines of each lane line as new starting points, and executing S2 until the maximum length among the lane contour lines calculated from each starting point is smaller than a preset length.
2. The interactive extraction method of left and right contour lines of lane lines based on laser point cloud as claimed in claim 1,
the step S1 includes:
S11, judging whether the laser point cloud in the current area has a matched track; if the lane line shape is similar to the track, namely there is no lane expansion or lane reduction in the road and no lane change in the track, selecting the track-guided lane line acquisition button to perform track-guided acquisition; if the laser point cloud of the current area has no matched track, or has a matched track but the track contains a lane change, selecting the manually guided lane line acquisition button to perform manually guided acquisition; the number of lane lines is N, where N ≥ 1;
S12, when performing track-guided acquisition, marking one point inside the white line of each lane line of the laser point cloud; when performing manually guided acquisition, marking a point inside the white line of each lane line of the laser point cloud, marking another point on the same white line along the extension direction of the lane line, and calculating the heading h of the last two points.
3. The interactive extraction method of left and right contour lines of lane lines based on laser point cloud of claim 2,
the step S3 includes:
S31, taking the input starting point as the base point P1, searching the guide points for the guide point G1 closest to P1, and acquiring the heading h1 of G1;
S32, calculating the left reference point L1 and the right reference point R1 at distance sd to the left and right of P1, perpendicular to the heading h1; calculating the next base point P2 at distance fd forward along the heading h1, searching the guide points for the guide point G2 closest to P2, and acquiring the heading h2 of G2;
S33, advancing along the guide points M times in sequence according to the method of S32 to obtain left guide line points L1 to LM and right guide line points R1 to RM; connecting the left guide line points in order and the right guide line points in reverse order to construct a polygon;
S34, calling the PCL point cloud processing library, and segmenting the point cloud with the polygon to obtain the point cloud polygonPC.
4. The interactive extraction method of left and right contour lines of lane lines based on laser point cloud of claim 3,
the step S4 includes:
S41, traversing each point of the polygonPC, counting the number of points whose reflection intensity falls in each interval of width interval, and dividing by the total number of points to obtain the probability value of each reflection intensity interval; from these statistics, the maximum-probability interval corresponds to the reflection intensity of the non-lane-line point cloud on the road; taking the start value of the highest reflection intensity interval as the maximum reflection intensity maxi; searching from the highest reflection intensity interval toward the lowest, taking the first interval whose probability is greater than prob as the reflection intensity whitei of the lane line point cloud; and taking the larger of whitei and the reflection intensity at the base point P1 as the base reflection intensity basei;
S42, calling the PCL point cloud processing library, and filtering the polygonPC by reflection intensity with lower limit basei and upper limit maxi to obtain the point cloud intensityPC.
5. The interactive extraction method of left and right contour lines of lane lines based on laser point cloud of claim 4,
the step S5 includes:
S51, traversing the shape points L1 to LM on the left guide line, calling PCL, and using KdTreeFLANN to retrieve the closest points PL1 to PLM in three-dimensional space on the intensityPC; traversing the shape points R1 to RM on the right guide line, calling PCL, and using KdTreeFLANN to retrieve the closest points PR1 to PRM in three-dimensional space on the intensityPC;
S52, for each point pair (PLi, PRi), i = 1 to M, calculating the distance lrWidth between the two points; if lrWidth is smaller than the standard lane line width stdWidth, calculating the center point Ci of PLi and PRi, searching for the heading hi of the guide point closest to Ci, and calculating calcPLi and calcPRi to the left and right perpendicular to the heading, respectively; letting PLi = calcPLi and PRi = calcPRi;
S53, calculating the center point Ci of PLi and PRi; if the reflection intensity at Ci is smaller than the lane line reflection intensity whitei, comparing the reflection intensity values of PLi and PRi: if the reflection intensity of PLi is smaller than that of PRi, shifting both PLi and PRi by wdist in the direction of PRi; if the reflection intensity of PLi is greater than that of PRi, shifting both PLi and PRi by wdist in the direction of PLi.
6. The interactive extraction method of left and right contour lines of lane lines based on laser point cloud of claim 5,
the step S6 includes:
S61, if the number of collected left and right contour line points of the lane line is 0, finishing the calculation of the current lane line, and executing steps S3-S6 for the next starting point until all starting points are calculated; otherwise, executing S62;
S62, if the current guidance mode is manual guidance, using the collected set of midpoints of the left and right contour line point pairs as the loaded guidance point set; if the guidance mode is track guidance, this step is not needed;
S63, calculating the midpoint CM of the tail points PLM and PRM of the left and right contour lines of the currently acquired lane line, and executing S3-S5 with CM as a new starting point until the number of shape points of the acquired left and right contour lines is 0; then ending the calculation of the current lane line and executing steps S3-S6 for the next starting point until all starting points are calculated.
7. The interactive extraction method of left and right contour lines of lane lines based on laser point cloud of claim 6,
the step S7 includes:
S71, executing step S72 on each acquired lane line;
S72, traversing the shape points on the lane line, starting from the second shape point, calculating the planar distance to the adjacent previous shape point, and rejecting the current shape point if the distance is smaller than minDist;
S73: traversing the result of S72, starting from the second shape point, calculating the included angle of the vectors formed by the current point with its adjacent previous and next shape points, and rejecting the current shape point if the angle is not within the interval [π-θ, π+θ];
S74: traversing the result of S73, starting from the second shape point, calculating the difference in z value from the adjacent previous shape point, and if the difference is greater than zthreshold, modifying the z of the current shape point so that the difference equals zthreshold; the z value is the value in the vertical direction;
S75: calculating the length of each left lane line contour line and taking the longest one as a reference line; calculating the projection points of the tail points of the other lane lines onto the reference line; dividing the reference line at each projection point and taking the part between the projection point and the tail of the reference line; adding to that part the coordinate offset between the projection point and the original tail point; and copying the shifted part to the tail of the original lane line.
CN201811032540.2A 2018-09-05 2018-09-05 Laser point cloud based lane line left and right contour line interactive extraction method Active CN109285163B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811032540.2A CN109285163B (en) 2018-09-05 2018-09-05 Laser point cloud based lane line left and right contour line interactive extraction method


Publications (2)

Publication Number Publication Date
CN109285163A CN109285163A (en) 2019-01-29
CN109285163B true CN109285163B (en) 2021-10-08

Family

ID=65183550

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811032540.2A Active CN109285163B (en) 2018-09-05 2018-09-05 Laser point cloud based lane line left and right contour line interactive extraction method

Country Status (1)

Country Link
CN (1) CN109285163B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110060266B (en) * 2019-04-24 2021-04-13 百度在线网络技术(北京)有限公司 Lane line extraction method and apparatus, server, and computer-readable medium
CN110363771B (en) * 2019-07-15 2021-08-17 武汉中海庭数据技术有限公司 Isolation guardrail shape point extraction method and device based on three-dimensional point cloud data
CN112487123A (en) * 2020-12-05 2021-03-12 武汉中海庭数据技术有限公司 Road connectivity testing method and system based on large-range high-precision map
CN112598075B (en) * 2020-12-29 2022-07-29 武汉中海庭数据技术有限公司 Crowdsourcing data multi-road segment elevation processing method and device
CN115953752B (en) * 2023-03-07 2023-07-21 中汽创智科技有限公司 Lane reference line extraction method and device, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101192307A (en) * 2006-11-17 2008-06-04 鸿富锦精密工业(深圳)有限公司 Point cloud triangular topological relations construction method
CN105488498A (en) * 2016-01-15 2016-04-13 武汉光庭信息技术股份有限公司 Lane sideline automatic extraction method and lane sideline automatic extraction system based on laser point cloud
CN108088445A (en) * 2016-11-22 2018-05-29 广州映博智能科技有限公司 3 d grid map path planning system and method based on octree representation
CN108133226A (en) * 2017-11-27 2018-06-08 西北工业大学 One kind is based on the improved three-dimensional point cloud feature extracting methods of HARRIS

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140198097A1 (en) * 2013-01-16 2014-07-17 Microsoft Corporation Continuous and dynamic level of detail for efficient point cloud object rendering
CN107844115B (en) * 2016-09-20 2019-01-29 北京百度网讯科技有限公司 Data capture method and device for automatic driving vehicle


Also Published As

Publication number Publication date
CN109285163A (en) 2019-01-29

Similar Documents

Publication Publication Date Title
CN109285163B (en) Laser point cloud based lane line left and right contour line interactive extraction method
EP3792901B1 (en) Ground mark extraction method, model training method, device and storage medium
CN110148196B (en) Image processing method and device and related equipment
CN111273305A (en) Multi-sensor fusion road extraction and indexing method based on global and local grid maps
CN109376586B (en) Road boundary line interactive automatic extraction method based on laser point cloud
CN108280840B (en) Road real-time segmentation method based on three-dimensional laser radar
CN102147250A (en) Digital line graph mapping method
CN111209291B (en) Method and system for updating high-precision map by using crowdsourcing perception map
CN111897365B (en) Autonomous vehicle three-dimensional path planning method for contour line guide line
CN108596165A (en) Road traffic marking detection method based on unmanned plane low latitude Aerial Images and system
EP2887315A1 (en) Calibration device, method for implementing calibration, program and camera for movable body
CN104121902A (en) Implementation method of indoor robot visual odometer based on Xtion camera
CN110458083B (en) Lane line vectorization method, device and storage medium
CN110956100A (en) High-precision map generation method and device, electronic equipment and storage medium
CN105956542B (en) High-resolution remote sensing image road extraction method based on statistical matching of structural wire harnesses
CN106500594B (en) Merge the railroad track method for semi-automatically detecting of reflected intensity and geometric properties
CN109544607A (en) A kind of cloud data registration method based on road mark line
CN112435336B (en) Curve type identification method and device, electronic equipment and storage medium
CN114140466B (en) Plant root system measuring method, system and device based on image processing
CN110060266B (en) Lane line extraction method and apparatus, server, and computer-readable medium
CN114593739A (en) Vehicle global positioning method and device based on visual detection and reference line matching
CN111209805A (en) Rapid fusion optimization method for multi-channel segment data of lane line crowdsourcing data
CN112418193B (en) Lane line identification method and system
CN113971723A (en) Method, device, equipment and storage medium for constructing three-dimensional map in high-precision map
CN106934832B (en) A kind of simple straight line automatic positioning method towards vision line walking

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant