CN111242000A - Road edge detection method combining laser point cloud steering - Google Patents

Road edge detection method combining laser point cloud steering

Info

Publication number
CN111242000A
CN111242000A (application CN202010023329.5A)
Authority
CN
China
Prior art keywords
road
point cloud
laser point
laser
edge
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010023329.5A
Other languages
Chinese (zh)
Inventor
马沪敏
陶冰洁
且若辰
彭真明
柳杨
程晓彬
王光慧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN202010023329.5A priority Critical patent/CN111242000A/en
Publication of CN111242000A publication Critical patent/CN111242000A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 — Scenes; Scene-specific elements
    • G06V20/50 — Context or environment of the image
    • G06V20/56 — Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 — Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 — Scenes; Scene-specific elements
    • G06V20/60 — Type of objects
    • G06V20/64 — Three-dimensional objects

Abstract

The invention discloses a road edge detection method combining laser point cloud steering, belongs to the field of driver assistance, and solves the problems that visual road edge detection is easily disturbed by shadow and that vehicle steering causes road edges to be falsely detected. Firstly, the laser point cloud and the corresponding image data are jointly calibrated, and the point cloud steering angle is calculated based on information entropy so that the laser points at the road edges become parallel to the X axis; the steered point cloud is projected onto the Y-X plane. Secondly, a road plane equation is constructed, and the road point cloud is extracted by combining M-estimator sample consensus with outlier processing. Next, the laser point cloud projected onto the Y-X plane is jointly constrained by the road laser points and the lidar position prior, and a cumulative histogram of the constrained point cloud along the Y direction is constructed; the laser points of the left and right road edges are determined on the left and right sides of the histogram origin, respectively. Finally, the left and right road edges are fitted by least squares with an L2 regular constraint to complete road edge detection. The invention is used for road edge detection.

Description

Road edge detection method combining laser point cloud steering
Technical Field
A road edge detection method combining laser point cloud steering, used for road edge detection, belonging to the field of driver assistance.
Background
Road detection is a fundamental research task in driving environment perception and plays an important role in guaranteeing driving safety and realizing high-level automated driving. The road edge is one of the most important features of a road, and extracting it benefits subsequent road detection tasks. In real outdoor environments, however, road edge detection faces challenges such as complex weather, illumination changes, and shadow occlusion. Furthermore, for unstructured roads without lane markings, complete road edges must still be detected. A road edge detection algorithm is therefore required to resist interference from weather, illumination, and shadow, and to detect road edges stably and robustly in preparation for subsequent road detection tasks.
Road edge detection in visual images is usually based on edge detection: for a road surface without shadow occlusion and with clear lane markings, road edges are detected by combining color and texture information, and the final road edges are then obtained via the Hough transform. When the road surface is occluded by large shadows or lacks lane markings, such methods cannot detect the road edges completely. Road edge detection based on vanishing-point detection can resist shadow interference to some extent, but it essentially extracts road edge texture with a Gabor filter bank and remains unstable under complex shadow interference. In addition, vanishing-point-based edge detection iteratively updates the vanishing point and the road edges to obtain the correct result, at a huge cost in computation and time.
In recent years, road edge detection based on deep learning has developed rapidly and can achieve high accuracy after training on massive image data. However, such methods require training on a large variety of road data and continuous data updating; labeling road edge data is time-consuming and labor-intensive, all road types cannot be covered, and the generalization of these methods is therefore limited. Furthermore, the interpretability of deep learning remains an unsolved problem.
Laser point clouds have the advantage of being immune to illumination changes and shadow occlusion at any time of day, so road edge detection based on laser point clouds is unaffected by illumination changes and shadow occlusion; even without lane markings, the road edge can still be detected stably. Wede Zhang (2010) decomposes the laser point cloud into an elevation signal and a ground-plane projection signal, determines road candidate areas with a local-extremum signal detection filter, classifies road areas using the elevation standard deviation of the candidate areas as a feature together with a minimum-width constraint, and finally projects road points onto the image plane and obtains the final road edges via the Hough transform. The method extracts road edges quickly and efficiently, but it involves many parameters that are difficult to tune. Xiangdong Wang et al. (2015) extract road edge points based on Kalman filtering and associate multiple consecutive road edge points into the same coordinate system through coordinate translation and conversion based on GPS and gyroscope data; finally, the road edge points are fitted with an improved RANSAC to obtain the road edges. This method obtains accurate road edge detection results with only a small number of iterations, but it requires the laser point cloud data corresponding to multiple consecutive image frames together with GPS and gyroscope data, and its road edge extraction from the lidar point cloud of a single image frame still needs improvement.
Rachmadi R F et al. (2017) propose a 3D laser point cloud road edge extraction method based on an encoder-decoder network, using a 3D voxel representation as the input and output of the network to realize end-to-end road edge detection, but the method overfits because of the imbalance of positive and negative samples in the dataset used. Lu X et al. (2019) propose a line segmentation method based on 3D discrete laser points to detect the edge lines of 3D point clouds: planes are first segmented from the discrete point cloud, each 3D plane is projected onto a 2D image plane where contour lines are extracted and line segments are obtained by least squares, and the 2D line segments are then projected back into the 3D planes to obtain the 3D result. Methods based on point cloud histograms can determine the road edge laser points quickly, but when the point cloud is steered the laser points do not accumulate at the road edges, so the laser points in the peak intervals found on the left and right sides of the origin are not the correct road edge points, causing false detection.
The road edge is one of the most important features of a road, and simple, efficient road edge detection makes subsequent road detection more stable. However, road edge detection based on visual images is easily affected by illumination changes and shadow occlusion. Laser point cloud data has strong advantages in resisting illumination changes and shadow occlusion, although point-cloud-based road edge detection has suffered from parameters that are difficult to tune or from requiring the laser point clouds of multiple image frames. Nevertheless, road edge detection based on laser point clouds still has great potential and advantages in detection robustness, and its performance can be further improved to provide a solid foundation for road detection tasks.
Disclosure of Invention
In view of the above problems, the invention aims to provide a road edge detection method combining laser point cloud steering, which solves the prior-art problems of edge false detection caused by laser point cloud steering, difficult parameter tuning, and the need for the laser point clouds of multiple image frames to obtain stable road edge detection results, thereby realizing an adaptive road edge detection method.
In order to achieve the purpose, the invention adopts the following technical scheme:
a road edge detection method combining laser point cloud steering comprises the following steps:
step 1: according to the spatial position relationship between the laser point cloud and the image, performing combined calibration on the point cloud data acquired by the laser radar and the image data acquired by the camera to acquire calibrated laser point cloud, removing invalid laser point cloud beyond the field of view of the image, and keeping the valid laser point cloud;
step 2: calculating a point cloud steering angle based on the information entropy, rotating the laser point cloud anticlockwise, wherein the steering angle corresponding to the minimum entropy value is the point cloud steering angle to be solved, so that the laser point on the edge of the road is rotated to be parallel to the X axis, and the subsequent extraction of the laser point on the edge of the road is facilitated;
Step 3: construct a road surface equation based on the road plane characteristics and the valid laser point cloud, estimate the road plane equation parameters based on M-estimator sample consensus, and extract the road area point cloud; since a small number of outliers usually remain after road plane fitting, remove them by deleting the 5% of laser points with the largest and the 5% with the smallest Y coordinates, obtaining the road laser point cloud;
Step 4: determine the approximate interval of the road point cloud based on the road laser point cloud, and further constrain the point cloud based on the lidar position prior;
according to the lidar position prior and the road laser point cloud, jointly constrain the valid laser point cloud to weaken the interference of non-road points on the subsequent road edge points, and then accumulate the constrained road laser point cloud along the Y direction to obtain a Y-direction cumulative point cloud histogram;
Step 5: since the left and right road edges lie on the left and right sides of the lidar, search for the road edge laser points on the left and right sides of the origin of the Y-direction cumulative point cloud histogram, respectively; after the road edge laser points are obtained, rotate them clockwise by the point cloud steering angle obtained in step 2 to recover the spatial coordinates of the road before the point cloud was rotated;
step 6: converting laser point clouds on the left edge and the right edge of the three-dimensional road into two-dimensional image plane projection points based on the mapping relation between the laser point clouds and an image plane, performing road edge fitting within 10 orders based on a least square method with L2 regular constraint, and selecting a road edge fitting result corresponding to the minimum value of a cost function;
Step 7: project the left and right road edges onto the image plane to complete road edge detection based on the laser point cloud.
Further, the specific steps of step 1 are as follows:
step 1.1: according to the spatial position relationship between the laser point cloud and the image, a rotation calibration matrix and a conversion matrix from the laser point cloud to the image are obtained, a mapping relationship between the laser point cloud and the image plane is established, and the projection coordinates of the three-dimensional laser point cloud on the two-dimensional image plane can be expressed as:
Figure BDA0002361581830000041
c (x, y, z, u, v) represents a laser point coordinate after the mapping relation between the laser point cloud and the image is established; c (x, y, z) represents the three-dimensional spatial coordinates of the laser spot; c (u, v) represents projection coordinates of the laser point projected on the two-dimensional image plane;
Figure BDA0002361581830000042
is a rotational calibration matrix;
Figure BDA0002361581830000043
a conversion matrix for converting the three-dimensional laser point velo to the image coordinate cam; p (x, y, z) is the three-dimensional laser point coordinates.
Step 1.2: and according to the relation between the two-dimensional projection coordinate of the laser point cloud and the image size, removing invalid laser point clouds exceeding the image plane, and keeping the valid laser point clouds.
Further, the specific steps of step 2 are as follows:
step 2.1: rotating the laser point cloud anticlockwise, performing histogram accumulation on the laser point cloud along the Y direction, calculating the information entropy of the laser point cloud accumulated histogram under different steering angles, and expressing the characteristic of data chaos degree by using the information entropy, when the road side is parallel to the X axis along the laser point, the information entropy of the projection histogram along the Y direction is minimum, selecting the steering angle corresponding to the minimum entropy value, so that the road laser point is parallel to the X axis, facilitating the extraction of the laser point at the edge of the follow-up road, and calculating the laser point cloud steering angle as shown in formula (2):
Figure BDA0002361581830000044
wherein, thetasteerRepresenting the point cloud steering angle corresponding to the minimum entropy value; h (theta) represents the information entropy of the accumulated histogram of the laser point cloud in the Y direction when the steering angle is theta; p is a radical ofθ(i) When the steering angle is theta, calculating the laser point cloud accumulated histogram along the Y direction, and assuming that the height Y of the laser point is in the interval [1, n ]]And the probability of occurrence of a laser spot with a height Y of i.
Step 2.1: after the laser point cloud steering angle is determined, the laser point cloud is rotated anticlockwise along the steering angle, so that the laser point at the edge of the road is parallel to the X axis, and the occurrence of false detection is reduced. The counterclockwise rotation calculation of the laser point cloud is shown in formulas (3) to (4):
Figure BDA0002361581830000051
Psteer(x,y,z)=P(x,y,z)*MAnti-clockwise(4)
wherein M isAnti-clockwisesteer) Indicating steering angle thetasteerA rotation transformation matrix rotated counterclockwise; thetasteerRepresenting a point cloud steering angle; p (x, y, z) represents the original effective laser point cloud; psteer(x, y, z) represents the steered effective laser point cloud.
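The entropy-based steering-angle search of step 2 can be sketched as below. The 1-degree search grid, the bin count, and the function names are illustrative assumptions; formula (2) fixes only the entropy criterion itself.

```python
import numpy as np

def steering_angle(points_xy, angles_deg=np.arange(-45, 46, 1), bins=100):
    """Find the rotation that minimises the entropy of the Y-direction
    accumulation histogram (formula (2)): when the road edges are parallel
    to the X axis, points pile up in few Y bins and the entropy is minimal."""
    best_theta, best_h = 0.0, np.inf
    for theta in np.deg2rad(angles_deg):
        c, s = np.cos(theta), np.sin(theta)
        y_rot = points_xy[:, 0] * s + points_xy[:, 1] * c   # Y after CCW rotation
        hist, _ = np.histogram(y_rot, bins=bins)
        p = hist / hist.sum()
        p = p[p > 0]
        h = -(p * np.log(p)).sum()                          # information entropy H(theta)
        if h < best_h:
            best_h, best_theta = h, theta
    return best_theta

def rotate_ccw(points_xy, theta):
    """Counter-clockwise rotation of the point cloud (formulas (3)-(4))."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return points_xy @ R.T
```

For two road edges lying along a direction 10 degrees off the X axis, the search should recover a steering angle of about -10 degrees.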
Further, the specific steps of step 3 are as follows:
step 3.1: the method comprises the following steps of constructing a ground plane equation together with effective laser point cloud based on the characteristic that road plane points are usually in the same plane:
ax+by+cz+d=0 (5)
wherein, (a, b, c, d) parameters of the ground plane equation;
step 3.2: performing road plane equation parameter estimation based on MSAC, wherein MSAC realizes the estimation of road plane equation parameters by repeatedly selecting a group of random road laser point cloud subsets in laser point cloud data, the selected random subsets are assumed as data interior points, namely road points, and the data points are used for estimating the parameters of the road plane equation to obtain the estimated road plane equation;
testing the rest data points by using the estimated road plane equation, and if a certain point meets the road plane equation, considering the point as a road point;
If enough points are classified as hypothesized road plane points, the estimated plane equation is considered reasonable. The road plane equation is then re-estimated using the hypothesized road points, and the plausibility of the road plane model is evaluated with a loss function. The MSAC loss function is shown in formulas (6)-(7):

$$C = \sum_{i} \rho\!\left(e_i^2\right), \qquad \rho\!\left(e^2\right) = \begin{cases} e^2, & e^2 < T^2 \\ T^2, & e^2 \geq T^2 \end{cases} \tag{6}$$

$$e^2 = \frac{\left(a x_1 + b y_1 + c z_1 + d\right)^2}{a^2 + b^2 + c^2} \tag{7}$$

wherein $C$ represents the loss function; $\rho(e^2)$ represents the penalty applied by the model to the data (inliers contribute their squared error, outliers a constant $T^2$); $T^2$ represents the threshold for classifying a point as an inlier; $e^2$ represents the error function used to judge whether a data point is an inlier, measuring the distance from the data point to the road plane; $(x_1, y_1, z_1)$ are the three-dimensional spatial coordinates of the data point;
step 3.3: in the process of road plane fitting based on MSAC, 5% of laser points with the largest and smallest laser points in the Y-axis direction of the laser point cloud are deleted to complete processing of abnormal laser points, and the road laser point cloud is obtained.
Further, the specific steps of step 4 are as follows:
step 4.1: determining the position of a road area according to a Y-direction coordinate interval of the road laser point cloud, eliminating non-road laser point cloud exceeding a certain range of the interval, reducing the interference of the non-road laser point cloud such as trees and buildings on two sides of a road to the extraction of road edge laser points, eliminating the laser point cloud which is inevitably not the road point cloud in the Z direction such as vehicle laser point cloud on a road surface through the laser point cloud position prior, and further reducing the interference;
therefore, the interference of non-road point cloud is removed by the common constraint of the road plane laser point and the laser radar position prior.
Step 4.2: and accumulating the effective laser points with reduced non-road area interference along the Y direction to obtain a Y-direction point cloud accumulated histogram.
Further, the specific steps of step 5 are:
step 5.1: after the laser point clouds are accumulated along the Y direction, a large amount of point cloud accumulation is formed in the road edge area, and the interference of non-road point cloud is eliminated, so that the final three-dimensional road left and right edge laser point clouds can be determined through the peak value intervals and the road edge width information on the left and right sides of the Y direction point cloud accumulation histogram origin.
Step 5.2: the three-dimensional road edge laser point is then rotated clockwise along the steering angle to restore the three-dimensional road edge laser point in the original space. The calculation of clockwise rotation of the laser point along the three-dimensional road edge is shown in equations (8) to (9).
Figure BDA0002361581830000071
PRoad(x,y,z)=P(x,y,z)Road-steer*MClockwisesteer) (9)
Wherein M isClockwisesteer) Indicating steering angle thetasteerA rotation transformation matrix that rotates clockwise; thetasteerRepresenting a point cloud steering angle; pRoad(x, y, z) represents the road edge laser point in the original space; p (x, y, z)Road-steerIndicating the steered road edge laser spot.
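The peak search of step 5.1 and the inverse rotation of step 5.2 can be sketched as below. Taking the single highest bin on each side of the origin is an illustrative simplification of the patent's peak-interval plus edge-width criterion; the clockwise rotation exactly undoes the counterclockwise one of step 2.

```python
import numpy as np

def edge_peaks(hist, edges):
    """Step 5.1 (simplified): take the highest histogram peak on each
    side of the origin (Y = 0, the lidar position) as the left/right edge."""
    centers = 0.5 * (edges[:-1] + edges[1:])
    left, right = centers < 0, centers >= 0
    y_left = centers[left][np.argmax(hist[left])]
    y_right = centers[right][np.argmax(hist[right])]
    return y_left, y_right

def rotate_cw(points_xy, theta):
    """Step 5.2: clockwise rotation by the steering angle (formulas
    (8)-(9)), undoing the counter-clockwise rotation of step 2."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, s], [-s, c]])
    return points_xy @ R.T
```

Rotating counterclockwise and then clockwise by the same steering angle returns the points to their original coordinates.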
Further, the specific steps of step 6 are:
According to the mapping relationship between the road laser point cloud and the image plane, obtain the two-dimensional image plane projection points corresponding to the laser point clouds of the three-dimensional left and right road edges, fit the projection points of each edge by the least squares method with an L2 regular constraint at each order up to 10 in turn, and select the order and fitting parameters corresponding to the minimum of the cost function to determine the final left and right road edges. The least squares with the L2 regular constraint is shown in formulas (10)-(11):

$$J_{1} = \sum_{i=1}^{m} \left( v_1^{(i)} - \sum_{k=0}^{na} a_k \left(u_1^{(i)}\right)^{k} \right)^{2} + \mu \sum_{k=0}^{na} a_k^{2} \tag{10}$$

$$J_{2} = \sum_{j=1}^{n} \left( v_2^{(j)} - \sum_{l=0}^{nb} b_l \left(u_2^{(j)}\right)^{l} \right)^{2} + \lambda \sum_{l=0}^{nb} b_l^{2} \tag{11}$$

wherein $(u_1, v_1)$ represents a projection point of the left road edge on the two-dimensional image plane, and $(u_1^{(i)}, v_1^{(i)})$ the projection point corresponding to the i-th left road edge point; $(u_2, v_2)$ represents a projection point of the right road edge on the two-dimensional image plane, and $(u_2^{(j)}, v_2^{(j)})$ the projection point corresponding to the j-th right road edge point; $J_1$ and $J_2$ respectively represent the cost functions of the L2-regularized least squares for the left and right road edges; $(m, n)$ respectively represent the numbers of laser points on the left and right road edges; $(na, nb)$ respectively represent the numbers of parameters of the left and right road edges; $a_k$ and $b_l$ respectively represent the k-th parameter of the left edge and the l-th parameter of the right edge; $\mu$ and $\lambda$ respectively represent the penalty parameters of the L2 regular terms for the left and right edges.
Further, the specific steps of step 7 are:
Project the left and right road edges onto the image plane to complete road edge detection based on the laser point cloud.
Compared with the prior art, the invention has the beneficial effects that:
1. Compared with road edge extraction based on visual images, road edge extraction based on the laser point cloud achieves all-day road edge detection without interference from illumination changes and shadow;
2. The method performs road edge detection directly on the 3D laser point cloud, extracts road edge points in 3D space, and comprehensively uses the data acquisition system and prior knowledge of the road area to realize stable edge detection. The parameters involved comprise the road plane equation parameters and the road edge width: the plane equation parameters are solved by MSAC (M-estimator sample consensus), and the road edge width is set from prior knowledge of road edge widths, with little influence on edge extraction accuracy. The method therefore needs no tedious manual parameter tuning and solves the problems of many, hard-to-tune parameters in road edge point extraction;
3. Before the laser points of the left and right road edges are extracted, the point cloud steering angle is calculated based on information entropy and the point cloud is steered, so that the road laser point cloud is parallel to the X axis, avoiding the false detection of road edge laser points caused by point cloud steering.
4. When fitting the road edge function, least squares fitting with an L2 regular constraint is used, and the order and fitting parameters corresponding to the minimum of the cost function are selected adaptively, achieving optimal road edge fitting and completing road edge detection.
5. The method extracts road edges based only on the laser point cloud corresponding to a single image frame. Conventional road edge extraction obtains robust results only with dense point clouds or the point clouds of multiple consecutive frames, and the point cloud of a single frame, lacking inter-frame temporal information, is easily disturbed by non-road laser points near the road edge. Methods that project dense point clouds onto the image plane for edge extraction yield incomplete or even undetectable road edges owing to point cloud sparsity, outliers, and the near-dense, far-sparse distribution of the points. The invention comprehensively uses prior knowledge of the data acquisition system and the road area, adds point cloud steering, needs no point clouds of consecutive frames, works on sparse point clouds alone, and finally realizes a simple and efficient road edge detection method based on least squares fitting with a regular constraint.
6. The key issue in point cloud rotation is determining the correct steering angle. Using the property that information entropy expresses the degree of disorder of data, the disorder of the point cloud projection along the Y axis is minimal when the road edge laser points are parallel to the X axis, so the steering angle is extracted adaptively. Manual selection of the steering angle can only proceed in stages and is time-consuming: the steering angle must first be determined by hand before subsequent operations can begin. The disclosed method selects the angle and rotates adaptively while directly proceeding to the next road edge extraction step.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a frame of image in the present invention;
FIG. 3 is a three-dimensional schematic view of an uncalibrated laser point cloud corresponding to FIG. 2 of the present invention;
FIG. 4 is a two-dimensional schematic diagram of the Y-X plane of the uncalibrated laser point cloud corresponding to FIG. 2 in accordance with the present invention;
FIG. 5 is a schematic diagram showing a comparison of the steered uncalibrated laser point cloud Y-X planar two-dimensional projection of FIG. 2 with the uncalibrated laser point cloud Y-X planar two-dimensional projection of FIG. 2, wherein red represents the steered uncalibrated laser point cloud Y-X planar two-dimensional projection, and blue represents the uncalibrated laser point cloud Y-X planar two-dimensional projection;
FIG. 6 is a two-dimensional schematic diagram, on the Y-X plane, of the steered road laser point cloud corresponding to FIG. 2 before abnormal laser points are removed; X represents the X axis and Y the Y axis of the three-dimensional point cloud space, and FIG. 6 shows the two-dimensional YX-plane projection of the road plane laser point cloud obtained by M-estimator sample consensus.
FIG. 7 is a two-dimensional schematic diagram, on the Y-X plane, of the steered non-road laser point cloud corresponding to FIG. 2; X represents the X axis and Y the Y axis of the three-dimensional point cloud space, and FIG. 7 shows the two-dimensional YX-plane projection of the non-road laser point cloud obtained by M-estimator sample consensus.
FIG. 8 is a two-dimensional schematic diagram, on the Y-X plane, of the road laser point cloud corresponding to FIG. 2 after steering and removal of abnormal laser points; X represents the X axis and Y the Y axis of the three-dimensional point cloud space; the YX-plane projection of the road laser point cloud with abnormal points removed may be compared against FIG. 6, where they are not removed.
FIG. 9 is a final cumulative histogram along the Y direction of the laser point cloud corresponding to FIG. 2 according to the present invention;
fig. 10 is a road edge detection result of the laser point cloud corresponding to fig. 2 in the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the detailed description and specific examples, while indicating the preferred embodiment of the invention, are intended for purposes of illustration only and are not intended to limit the scope of the invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
It is noted that relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The features and properties of the present invention are described in further detail below with reference to examples.
A road edge detection method combining laser point cloud steering comprises the following steps:
step 1: according to the spatial position relationship between the laser point cloud and the image, performing combined calibration on the point cloud data acquired by the laser radar and the image data acquired by the camera to acquire calibrated laser point cloud, removing invalid laser point cloud beyond the field of view of the image, and keeping the valid laser point cloud;
further, the specific steps of step 1 are as follows:
step 1.1: according to the spatial position relationship between the laser point cloud and the image, a rotation calibration matrix and a conversion matrix from the laser point cloud to the image are obtained, and a mapping relationship between the laser point cloud and the image plane is established; the projection coordinates of the three-dimensional laser point cloud on the two-dimensional image plane can be expressed as:
C(x, y, z, u, v) = R · T_velo→cam · P(x, y, z) (1)
wherein C(x, y, z, u, v) represents the laser point coordinates after the mapping relationship between the laser point cloud and the image is established; C(x, y, z) represents the three-dimensional spatial coordinates of the laser point; C(u, v) represents the projection coordinates of the laser point projected onto the two-dimensional image plane; R is the rotation calibration matrix; T_velo→cam is the conversion matrix for converting the three-dimensional laser points (velo) to the image coordinates (cam); P(x, y, z) are the three-dimensional laser point coordinates.
Step 1.2: and according to the relation between the two-dimensional projection coordinate of the laser point cloud and the image size, removing invalid laser point clouds exceeding the image plane, and keeping the valid laser point clouds.
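The projection and filtering of steps 1.1 and 1.2 can be sketched as follows. The 3x4 matrix `M` stands for the combined rotation calibration and velo-to-cam conversion of formula (1); its numeric content, the toy camera model in the test, and the function names are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def project_points(P_velo, M):
    """Project 3-D laser points (n, 3) to pixel coordinates with a 3x4
    projection matrix M (assumed: rotation calibration combined with the
    velo->cam conversion, formula (1))."""
    homo = np.hstack([P_velo, np.ones((len(P_velo), 1))])  # homogeneous (n, 4)
    uvw = homo @ M.T                                       # projective image coordinates
    return uvw[:, :2] / uvw[:, 2:3], uvw[:, 2]             # pixel (u, v) and depth

def keep_valid(P_velo, M, img_w, img_h):
    """Step 1.2: keep only laser points whose projection lies inside the
    image plane (and in front of the camera)."""
    uv, depth = project_points(P_velo, M)
    mask = ((depth > 0)
            & (uv[:, 0] >= 0) & (uv[:, 0] < img_w)
            & (uv[:, 1] >= 0) & (uv[:, 1] < img_h))
    return P_velo[mask], uv[mask]
```

A point behind the camera or projecting outside the image bounds is treated as an invalid laser point and dropped.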
Step 2: calculating a point cloud steering angle based on the information entropy, rotating the laser point cloud anticlockwise, wherein the steering angle corresponding to the minimum entropy value is the point cloud steering angle to be solved, so that the laser point on the edge of the road is rotated to be parallel to the X axis, and the subsequent extraction of the laser point on the edge of the road is facilitated;
further, the specific steps of step 2 are as follows:
step 2.1: rotating the laser point cloud anticlockwise, accumulating a histogram of the laser point cloud along the Y direction, calculating the information entropy of the cumulative histogram at the different steering angles, and selecting the steering angle corresponding to the minimum entropy value, which facilitates the subsequent extraction of the road edge laser points; the laser point cloud steering angle is calculated as shown in formula (2):
θ_steer = argmin_θ H(θ), with H(θ) = -Σ_{i=1}^{n} p_θ(i)·log p_θ(i) (2)
wherein θ_steer represents the point cloud steering angle corresponding to the minimum entropy value; H(θ) represents the information entropy of the Y-direction cumulative histogram of the laser point cloud when the steering angle is θ; p_θ(i) represents, when the steering angle is θ and the cumulative histogram of the laser point cloud is calculated along the Y direction, the probability of occurrence of a laser point whose Y value is i, assuming the Y values of the laser points lie in the interval [1, n].
Step 2.2: after the laser point cloud steering angle is determined, the laser point cloud is rotated anticlockwise by the steering angle, so that the laser points on the road edge become parallel to the X axis and false detections are reduced. The anticlockwise rotation of the laser point cloud is calculated as shown in formulas (3) to (4):
M_anti-clockwise(θ_steer) = [ cos θ_steer, sin θ_steer, 0; -sin θ_steer, cos θ_steer, 0; 0, 0, 1 ] (3)
P_steer(x, y, z) = P(x, y, z) · M_anti-clockwise(θ_steer) (4)
wherein M_anti-clockwise(θ_steer) represents the rotation transformation matrix for anticlockwise rotation by the steering angle θ_steer; θ_steer represents the point cloud steering angle; P(x, y, z) represents the original valid laser point cloud; P_steer(x, y, z) represents the steered valid laser point cloud.
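The entropy-minimizing steering of step 2 can be sketched as below. The angle grid, bin count, and function names are illustrative assumptions; the rotation follows the row-vector convention of formula (4).

```python
import numpy as np

def y_entropy(points, theta, bins=100):
    """Entropy H(theta) of the Y-direction cumulative histogram after an
    anticlockwise rotation by theta about the Z axis (formula (2))."""
    y = points[:, 0] * np.sin(theta) + points[:, 1] * np.cos(theta)
    counts, _ = np.histogram(y, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log(p)).sum()

def steering_angle(points, angles=np.deg2rad(np.arange(-45.0, 46.0))):
    """Pick the steering angle with minimum cumulative-histogram entropy."""
    return min(angles, key=lambda t: y_entropy(points, t))

def rotate_ccw(points, theta):
    """Anticlockwise rotation about Z, formulas (3)-(4), row vectors."""
    M = np.array([[np.cos(theta),  np.sin(theta), 0.0],
                  [-np.sin(theta), np.cos(theta), 0.0],
                  [0.0,            0.0,           1.0]])
    return points @ M
```

A straight edge line collapses into a single histogram bin (entropy near zero) exactly when it has been rotated parallel to the X axis, which is why the minimum-entropy angle aligns the road edges.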
And step 3: constructing a road surface equation based on the road plane characteristics and the valid laser point cloud, estimating the road plane equation parameters based on M-estimator sample consensus (MSAC), and extracting the road-area point cloud; a small number of abnormal points usually remain after road plane fitting, so the 5% of laser points with the largest Y coordinates and the 5% with the smallest Y coordinates are deleted to remove the abnormal points and obtain the road laser point cloud;
further, the specific steps of step 3 are as follows:
step 3.1: the method comprises the following steps of constructing a ground plane equation together with effective laser point cloud based on the characteristic that road plane points are usually in the same plane:
ax+by+cz+d=0 (5)
wherein (a, b, c, d) are the parameters of the ground plane equation;
step 3.2: performing road plane equation parameter estimation based on MSAC, wherein MSAC realizes the estimation of road plane equation parameters by repeatedly selecting a group of random road laser point cloud subsets in laser point cloud data, the selected random subsets are assumed as data interior points, namely road points, and the data points are used for estimating the parameters of the road plane equation to obtain the estimated road plane equation;
testing the rest data points by using the estimated road plane equation, and if a certain point meets the road plane equation, considering the point as a road point;
if enough points are classified as the hypothetical road plane points, the estimated plane equation is considered reasonable enough. Then, the road plane equation is re-estimated using the hypothetical road points, and the reasonableness of the road plane model is evaluated by a loss function; the loss function of MSAC is shown in formulas (6) to (7):
C = Σ ρ(e²) (6)
ρ(e²) = e², if e² ≤ T²; ρ(e²) = T², if e² > T² (7)
wherein C represents the loss function; ρ(e²) represents the penalty applied to the data points fitted by the model; T² represents the threshold for classifying a data point as an inlier; e² represents the error function for judging whether a data point is an inlier, measuring the squared distance from the data point to the road plane, e² = (a·x₁ + b·y₁ + c·z₁ + d)² / (a² + b² + c²); (x₁, y₁, z₁) represents the three-dimensional spatial coordinates of a data point;
Step 3.3: in the MSAC-based road plane fitting process, the 5% of laser points with the largest Y coordinates and the 5% with the smallest Y coordinates are deleted to complete the processing of abnormal laser points and obtain the road laser point cloud.
And 4, step 4: determining an approximate interval of the road point cloud based on the road laser point cloud, and further constraining the point cloud based on the position prior of the laser radar;
according to the position prior of the laser radar and the road laser point cloud, jointly constraining effective laser point cloud, weakening the interference of non-road point cloud on the edge points of the subsequent road, and then accumulating the constrained road laser point cloud along the Y direction to obtain a point cloud accumulated histogram along the Y direction;
further, the specific steps of step 4 are as follows:
step 4.1: determining an approximate interval of the road-area position according to the road laser point cloud, and rejecting the non-road laser point cloud. Meanwhile, since the collected laser point cloud takes the position of the laser radar as its origin, laser points above the origin along the Z-axis direction are necessarily not road-area laser points, and this prior knowledge further reduces the interference of non-road-area point cloud on the subsequent determination of the road edge. The interference of non-road point cloud is therefore removed through the joint constraint of the road plane points and the laser radar position prior.
Step 4.2: and accumulating the effective laser points with reduced non-road area interference along the Y direction to obtain a Y-direction point cloud accumulated histogram.
And 5: according to the characteristic that the left and right edges of the road are positioned at the left and right sides of the laser radar, respectively searching laser point clouds at the left and right sides of the origin of the point cloud accumulation histogram along the Y direction, after obtaining laser points at the edge of the road, clockwise rotating the laser point clouds along the point cloud steering angle obtained in the step 2, and recovering the space position coordinates of the road before the laser point clouds are rotated;
further, the step 5 is as follows:
step 5.1: after the laser point clouds are accumulated along the Y direction, a large amount of point cloud accumulation is formed in the road edge area, and the interference of non-road point cloud is eliminated, so that the final three-dimensional road left and right edge laser point clouds can be determined through the peak value intervals and the road edge width information on the left and right sides of the Y direction point cloud accumulation histogram origin.
Step 5.2: the three-dimensional road edge laser points are then rotated clockwise by the steering angle to restore the three-dimensional road edge laser points to the original space. The clockwise rotation of the three-dimensional road edge laser points is calculated as shown in formulas (8) to (9):
M_clockwise(θ_steer) = [ cos θ_steer, -sin θ_steer, 0; sin θ_steer, cos θ_steer, 0; 0, 0, 1 ] (8)
P_Road(x, y, z) = P_Road-steer(x, y, z) · M_clockwise(θ_steer) (9)
wherein M_clockwise(θ_steer) represents the rotation transformation matrix for clockwise rotation by the steering angle θ_steer; θ_steer represents the point cloud steering angle; P_Road(x, y, z) represents the road edge laser points in the original space; P_Road-steer(x, y, z) represents the steered road edge laser points.
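The peak search of step 5.1 and the clockwise rotation of step 5.2 can be sketched as below; the single-peak-per-side selection is a simplification (the patent additionally uses road edge width information), and the function names are illustrative.

```python
import numpy as np

def edge_positions(counts, edges):
    """Step 5.1 (sketch): take the strongest histogram peak on each side
    of the origin as the left/right road-edge position along Y."""
    centers = (edges[:-1] + edges[1:]) / 2.0
    left_c, left_n = centers[centers < 0], counts[centers < 0]
    right_c, right_n = centers[centers >= 0], counts[centers >= 0]
    return left_c[np.argmax(left_n)], right_c[np.argmax(right_n)]

def rotate_cw(points, theta):
    """Step 5.2: clockwise rotation about Z by the steering angle,
    formulas (8)-(9), undoing the earlier anticlockwise steer."""
    M = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0,            0.0,           1.0]])
    return points @ M
```

The clockwise matrix is the inverse of the anticlockwise matrix of formula (3), so applying it returns the edge points to their original spatial coordinates.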
Step 6: converting laser point clouds on the left edge and the right edge of the three-dimensional road into two-dimensional image plane projection points based on the mapping relation between the laser point clouds and an image plane, performing road edge fitting within 10 orders based on a least square method with L2 regular constraint, and selecting a road edge fitting result corresponding to the minimum value of a cost function;
further, the specific steps of step 6 are:
according to the mapping relationship between the road laser point cloud and the image plane, the two-dimensional image plane projection points corresponding to the three-dimensional left and right road edge laser point clouds are obtained; the two-dimensional image plane projection points of the left and right road edge laser point clouds are fitted in turn at each order within order 10 based on the least squares method with L2 regular constraint, and the order and fitting parameters corresponding to the minimum value of the cost function are selected to determine the final left and right road edges. The least squares with L2 regular constraint is shown in formulas (10) to (11):
f₁(u) = Σ_{k=0}^{na} a_k·u^k, f₂(u) = Σ_{l=0}^{nb} b_l·u^l, with a_0, …, a_na and b_0, …, b_nb as fitting parameters (10)
C₁ = Σ_{i=1}^{m} (f₁(u₁^(i)) - v₁^(i))² + μ·Σ_{k=0}^{na} a_k², C₂ = Σ_{j=1}^{n} (f₂(u₂^(j)) - v₂^(j))² + λ·Σ_{l=0}^{nb} b_l² (11)
wherein the input of each fitting function is the u coordinate of the two-dimensional projection point of a laser point; (u₁, v₁) represents a left road edge projection point on the two-dimensional image plane; (u₁^(i), v₁^(i)) represents the two-dimensional image plane projection point corresponding to the i-th left road edge point; (u₂, v₂) represents a right road edge projection point on the two-dimensional image plane; (u₂^(j), v₂^(j)) represents the two-dimensional image plane projection point corresponding to the j-th right road edge point; C₁ and C₂ respectively represent the cost functions of the least squares method with L2 regular constraint for the left and right road edges; (m, n) respectively represent the numbers of laser points on the left and right road edges; (na, nb) respectively represent the numbers of parameters of the left and right road edges; a_k and b_l respectively represent the k-th parameter of the left road edge and the l-th parameter of the right road edge; μ and λ respectively represent the penalty parameters of the L2 regular terms for the left and right road edges.
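The order-selection loop of step 6 can be sketched as below, fitting one edge at a time; the closed-form ridge solve per order, the names `fit_edge`, `max_order`, and `mu`, and the default values are illustrative assumptions, not from the patent.

```python
import numpy as np

def fit_edge(u, v, max_order=10, mu=1e-3):
    """Step 6 (sketch): for each order up to max_order, solve the
    L2-regularized (ridge) least squares problem of formulas (10)-(11)
    in closed form and keep the coefficients with the smallest cost."""
    best_cost, best_coef = np.inf, None
    for order in range(1, max_order + 1):
        A = np.vander(u, order + 1)        # design matrix, highest power first
        coef = np.linalg.solve(A.T @ A + mu * np.eye(order + 1), A.T @ v)
        cost = ((A @ coef - v) ** 2).sum() + mu * (coef ** 2).sum()
        if cost < best_cost:
            best_cost, best_coef = cost, coef
    return best_coef
```

The left and right edges are fitted independently with their own penalty parameters, and the L2 term keeps the higher-order coefficients small so that the selected polynomial does not oscillate between edge points.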
And 7: and projecting the left and right edges of the road on the image surface to finish the road edge detection based on the laser point cloud.
Further, the specific steps of step 7 are:
and projecting the left and right edges of the road on the image surface to finish the road edge detection based on the laser point cloud.
Examples
Step 1: and performing joint calibration on the point cloud data acquired by the laser radar and the image data acquired by the camera according to the spatial position relationship between the laser point cloud and the image to acquire calibrated laser point cloud, removing invalid laser point cloud beyond the field of view of the image, and keeping the valid laser point cloud. Wherein the image data is as shown in FIG. 2; FIG. 2 is a three-dimensional schematic of the uncalibrated point cloud shown in FIG. 3; FIG. 2 is a schematic Y-X two-dimensional plan view of the uncalibrated point cloud of FIG. 4;
step 2: and calculating a point cloud steering angle based on the information entropy, rotating the laser point cloud anticlockwise, wherein the steering angle corresponding to the minimum entropy value is the point cloud steering angle to be solved, so that the laser point on the edge of the road is rotated to be parallel to the X axis, and the subsequent extraction of the laser point on the edge of the road is facilitated. A schematic diagram comparing the steered uncalibrated laser point cloud Y-X planar two-dimensional projection corresponding to FIG. 2 with the uncalibrated laser point cloud Y-X planar two-dimensional projection corresponding to FIG. 2 is shown in FIG. 5, wherein red represents the steered uncalibrated laser point cloud Y-X planar two-dimensional projection, and blue represents the uncalibrated laser point cloud Y-X planar two-dimensional projection;
and step 3: the method comprises the steps of constructing a road surface equation based on the road plane characteristics and the valid laser point cloud, estimating the road plane equation parameters based on M-estimator sample consensus (MSAC), extracting the road-area point cloud, and, since a small number of abnormal points generally remain after road plane fitting, removing the abnormal points by deleting the 5% of laser points with the largest and the 5% with the smallest Y coordinates to obtain the road laser point cloud. Wherein the two-dimensional schematic diagram of the Y-X plane of the road laser point cloud corresponding to FIG. 2 after steering, without abnormal laser points removed, is shown in FIG. 6; the two-dimensional schematic diagram of the Y-X plane of the steered non-road laser point cloud corresponding to FIG. 2 is shown in FIG. 7; the two-dimensional schematic diagram of the Y-X plane of the road laser point cloud corresponding to FIG. 2 after steering, with abnormal laser points removed, is shown in FIG. 8;
and 4, step 4: determining an approximate interval of the road point cloud based on the road laser point cloud, and further constraining the point cloud based on the position prior of the laser radar;
according to the position prior of the laser radar and the road laser point cloud, the effective laser point cloud is jointly constrained, the interference of non-road point cloud on the edge points of the follow-up road is weakened, then the constrained road laser point cloud is accumulated along the Y direction, and a point cloud accumulated histogram along the Y direction is obtained. Fig. 9 shows the final cumulative histogram along the Y direction corresponding to the laser point cloud, where the horizontal axis represents the distance from the laser point to the origin along the Y direction, and the vertical axis represents the number of laser points at different distances;
and 5: according to the characteristic that the left and right edges of the road are positioned at the left and right sides of the laser radar, respectively searching laser point clouds at the left and right sides of the origin of the point cloud accumulation histogram along the Y direction, after obtaining laser points at the edge of the road, clockwise rotating the laser point clouds along the point cloud steering angle obtained in the step 2, and recovering the space position coordinates of the road before the laser point clouds are rotated;
step 6: converting laser point clouds on the left edge and the right edge of the three-dimensional road into two-dimensional image plane projection points based on the mapping relation between the laser point clouds and an image plane, performing road edge fitting within 10 orders based on a least square method with L2 regular constraint, and selecting a road edge fitting result corresponding to the minimum value of a cost function;
and 7: and projecting the left and right edges of the road on the image plane onto the image plane to complete the road edge detection based on the laser point cloud, wherein the road edge detection result is shown in fig. 10.
The above are merely representative examples of the many specific applications of the present invention, and do not limit the scope of the invention in any way. All the technical solutions formed by the transformation or the equivalent substitution fall within the protection scope of the present invention.

Claims (8)

1. A road edge detection method combining laser point cloud steering is characterized in that: the method comprises the following steps:
step 1: according to the spatial position relationship between the laser point cloud and the image, performing combined calibration on the point cloud data acquired by the laser radar and the image data acquired by the camera to acquire calibrated laser point cloud, removing invalid laser point cloud beyond the field of view of the image, and keeping the valid laser point cloud;
step 2: calculating a laser point cloud steering angle based on the information entropy, and rotating the laser point cloud anticlockwise according to the laser point cloud steering angle to enable the laser point on the road edge to rotate to be parallel to the X axis so as to obtain an effective laser point cloud after steering;
and step 3: constructing a road surface equation based on the road plane characteristics and the valid laser point cloud, estimating the road plane equation parameters based on M-estimator sample consensus, and extracting the road-area point cloud, wherein, since a small number of abnormal points usually remain after road plane fitting, the abnormal points are removed by deleting the 5% of laser points with the largest and the 5% with the smallest Y coordinates, obtaining the road laser point cloud;
and 4, step 4: determining an approximate interval of the road point cloud based on the road laser point cloud, and further constraining the point cloud based on the position prior of the laser radar:
according to the position prior of the laser radar and the road laser point cloud, jointly constraining effective laser point cloud, weakening the interference of non-road point cloud on the edge points of the subsequent road, and then accumulating the constrained road laser point cloud along the Y direction to obtain a point cloud accumulated histogram along the Y direction;
and 5: according to the characteristic that the left and right edges of the road are positioned at the left and right sides of the laser radar, respectively searching laser point clouds at the left and right sides of the origin of the point cloud accumulation histogram along the Y direction, after obtaining laser points at the edge of the road, clockwise rotating the laser point clouds along the point cloud steering angle obtained in the step 2, and recovering the space position coordinates of the road before the laser point clouds are rotated;
step 6: converting the laser point clouds at the left edge and the right edge of the three-dimensional road into two-dimensional image plane projection points based on the mapping relation between the laser point clouds and an image plane, performing road edge fitting within n orders based on a least square method with L2 regular constraint, selecting road edge fitting function parameters and orders corresponding to the minimum value of a cost function, and constructing a road left edge function and a road right edge function to obtain road left edge and road right edge fitting results;
and 7: and projecting the fitting result of the left and right edges of the road on the image plane onto the image plane to finish the road edge detection based on the laser point cloud.
2. The method for detecting the road edge by combining the laser point cloud steering as claimed in claim 1, wherein the method comprises the following steps: the specific steps of the step 1 are as follows:
step 1.1: according to the spatial position relationship between the laser point cloud and the image, a rotation calibration matrix and a conversion matrix from the laser point cloud to the image are obtained, a mapping relationship between the laser point cloud and the image plane is established, and the projection coordinates of the three-dimensional laser point cloud on the two-dimensional image plane can be expressed as:
C(x, y, z, u, v) = R · T_velo→cam · P(x, y, z) (1)
wherein C(x, y, z, u, v) represents the laser point coordinates after the mapping relationship between the laser point cloud and the image is established; C(x, y, z) represents the three-dimensional spatial coordinates of the laser point; C(u, v) represents the projection coordinates of the laser point projected onto the two-dimensional image plane; R is the rotation calibration matrix; T_velo→cam is the conversion matrix for converting the three-dimensional laser points (velo) to the image coordinates (cam); P(x, y, z) are the three-dimensional laser point coordinates;
step 1.2: and according to the relation between the two-dimensional projection coordinate of the laser point cloud and the image size, removing invalid laser point clouds exceeding the image plane, and keeping the valid laser point clouds.
3. The method for detecting the road edge by combining the laser point cloud steering as claimed in claim 2, wherein the method comprises the following steps: the specific steps of the step 2 are as follows:
step 2.1: rotating the laser point cloud anticlockwise, performing histogram accumulation on the laser point cloud along the Y direction, calculating the information entropy of the histogram accumulation of the laser point cloud under different steering angles, selecting the steering angle corresponding to the minimum entropy value as the steering angle of the laser point cloud, facilitating the extraction of the laser point at the edge of a follow-up road, and calculating the steering angle of the laser point cloud according to a formula (2):
θ_steer = argmin_θ H(θ), with H(θ) = -Σ_{i=1}^{n} p_θ(i)·log p_θ(i) (2)
wherein θ_steer represents the point cloud steering angle corresponding to the minimum entropy value; H(θ) represents the information entropy of the Y-direction cumulative histogram of the laser point cloud when the steering angle is θ; p_θ(i) represents the probability of occurrence of a laser point whose Y-direction distance from the origin is i;
step 2.2: after the laser point cloud steering angle is determined, the laser point cloud is rotated anticlockwise by the steering angle, so that the laser points on the road edge are parallel to the X axis and false detections are reduced; the anticlockwise rotation of the laser point cloud is calculated as shown in formulas (3) to (4):
M_anti-clockwise(θ_steer) = [ cos θ_steer, sin θ_steer, 0; -sin θ_steer, cos θ_steer, 0; 0, 0, 1 ] (3)
P_steer(x, y, z) = P(x, y, z) · M_anti-clockwise(θ_steer) (4)
wherein M_anti-clockwise(θ_steer) represents the rotation transformation matrix for anticlockwise rotation by the steering angle θ_steer; θ_steer represents the point cloud steering angle; P(x, y, z) represents the original valid laser point cloud; P_steer(x, y, z) represents the steered valid laser point cloud.
4. The method for detecting the road edge by combining the laser point cloud steering as claimed in claim 3, wherein the method comprises the following steps: the specific steps of the step 3 are as follows:
step 3.1: the method comprises the following steps of constructing a ground plane equation together with effective laser point cloud based on the characteristic that road plane points are usually in the same plane:
ax+by+cz+d=0 (5)
wherein (a, b, c, d) are the parameters of the ground plane equation;
step 3.2: performing road plane equation parameter estimation based on MSAC, wherein MSAC realizes the estimation of road plane equation parameters by repeatedly selecting a group of random road laser point cloud subsets in laser point cloud data, the selected random subsets are assumed as data interior points, namely road points, and the data points are used for estimating the parameters of the road plane equation to obtain the estimated road plane equation;
testing the rest data points by using the estimated road plane equation, and if a certain point meets the road plane equation, considering the point as a road point;
if enough points are classified as the hypothetical road plane points, the estimated plane equation is reasonable enough; then the hypothetical road points are used to estimate the road plane equation again, and the reasonableness of the road plane model is evaluated by a loss function; the loss function of MSAC is shown in formulas (6) to (7):
C = Σ ρ(e²) (6)
ρ(e²) = e², if e² ≤ T²; ρ(e²) = T², if e² > T² (7)
wherein C represents the loss function; ρ(e²) represents the penalty applied to the data points fitted by the model; T² represents the threshold for classifying a data point as an inlier; e² represents the error function for judging whether a data point is an inlier, measuring the squared distance from the data point to the road plane, e² = (a·x₁ + b·y₁ + c·z₁ + d)² / (a² + b² + c²); (x₁, y₁, z₁) represents the three-dimensional spatial coordinates of a data point;
step 3.3: in the MSAC-based road plane fitting process, the 5% of laser points with the largest Y coordinates and the 5% with the smallest Y coordinates are deleted to complete the processing of abnormal laser points and obtain the road laser point cloud.
5. The method for detecting the road edge by combining the laser point cloud steering as claimed in claim 4, wherein the method comprises the following steps: the specific steps of the step 4 are as follows:
step 4.1: determining the position of a road area according to a Y-direction coordinate interval of the road laser point cloud, eliminating non-road laser point cloud exceeding a certain range of the interval, reducing the interference of the non-road laser point cloud such as trees and buildings on two sides of a road to the extraction of road edge laser points, eliminating the laser point cloud which is inevitably not the road point cloud in the Z direction such as vehicle laser point cloud on a road surface through the laser point cloud position prior, and further reducing the interference;
therefore, the interference of non-road point cloud is removed by the common constraint of the road plane laser point and the laser radar position prior;
step 4.2: and accumulating the effective laser points with reduced non-road area interference along the Y direction to obtain a Y-direction point cloud accumulated histogram.
6. The method for detecting the road edge by combining the laser point cloud steering as claimed in claim 5, wherein the method comprises the following steps: the step 5 is as follows:
step 5.1: after point cloud accumulation is carried out on the laser point clouds along the Y direction, a large amount of point cloud accumulation can be formed in the road edge area, and the interference of non-road point cloud is eliminated, so that the final three-dimensional road left and right edge laser point clouds can be determined through the peak value intervals and the road edge width information on the left and right sides of the origin of the Y-direction point cloud accumulation histogram;
step 5.2: subsequently, the three-dimensional road edge laser point is rotated clockwise along the steering angle to restore the three-dimensional road edge laser point in the original space, and the clockwise rotation of the three-dimensional road edge laser point is calculated as shown in formulas (8) to (9):
M_clockwise(θ_steer) = [ cos θ_steer, -sin θ_steer, 0; sin θ_steer, cos θ_steer, 0; 0, 0, 1 ] (8)
P_Road(x, y, z) = P_Road-steer(x, y, z) · M_clockwise(θ_steer) (9)
wherein M_clockwise(θ_steer) represents the rotation transformation matrix for clockwise rotation by the steering angle θ_steer; θ_steer represents the point cloud steering angle; P_Road(x, y, z) represents the road edge laser points in the original space; P_Road-steer(x, y, z) represents the steered road edge laser points.
7. The method for detecting the road edge by combining the laser point cloud steering as claimed in claim 6, wherein the method comprises the following steps: the specific steps of the step 6 are as follows:
according to the mapping relation between the road laser point cloud and the image plane, the two-dimensional image-plane projection points corresponding to the three-dimensional left and right road-edge laser point clouds are obtained; the projection points of the left and right edges are fitted in turn with polynomials up to order n by a least square method with an L2 regular constraint, and the order and fitting parameters corresponding to the minimum value of the cost function determine the final left and right road edges; the least squares with the L2 regular constraint is calculated as shown in formulas (10) to (11):

J_1(a_0, …, a_na) = Σ_{i=1}^{m} (v_1^(i) − Σ_{k=0}^{na} a_k (u_1^(i))^k)^2 + μ Σ_{k=0}^{na} a_k^2 (10)

J_2(b_0, …, b_nb) = Σ_{j=1}^{n} (v_2^(j) − Σ_{l=0}^{nb} b_l (u_2^(j))^l)^2 + λ Σ_{l=0}^{nb} b_l^2 (11)

wherein (u_1, v_1) represents a left road-edge projection point on the two-dimensional image plane; (u_1^(i), v_1^(i)) represents the two-dimensional image-plane projection point corresponding to the i-th left road-edge point; (u_2, v_2) represents a right road-edge projection point on the two-dimensional road image plane; (u_2^(j), v_2^(j)) represents the two-dimensional image-plane projection point corresponding to the j-th right road-edge point; J_1, J_2 respectively represent the cost functions of the L2-regular-constrained least square method for the left and right road-edge bands; (m, n) respectively represent the numbers of laser points on the left and right road edges; (na, nb) respectively represent the numbers of fitting parameters of the left and right road edges; a_k, b_l respectively represent the k-th parameter of the left edge and the l-th parameter of the right edge; μ, λ respectively represent the penalty parameters of the L2 regular terms of the left and right edges.
8. The road edge detection method combining laser point cloud steering according to claim 7, wherein the specific steps of step 7 are as follows:
the left and right road edges fitted by the least squares with the L2 regular constraint are projected onto the image, completing the road edge detection combined with laser point cloud steering.
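Drawing a fitted edge back onto the image (step 7) reduces to evaluating the chosen polynomial over the image columns and clipping to the frame. A minimal sketch; the helper name and image size are assumed for illustration:

```python
import numpy as np

def rasterize_edge(coeffs, u_range, height, width):
    """Evaluate a fitted edge polynomial v(u) = sum_k c_k u^k over u_range
    and return integer (u, v) pixel coordinates inside the image bounds."""
    u = np.arange(u_range[0], u_range[1] + 1)
    v = np.polynomial.polynomial.polyval(u, coeffs)  # coeffs low-to-high
    pts = np.stack([u, np.round(v)], axis=1).astype(int)
    keep = ((pts[:, 0] >= 0) & (pts[:, 0] < width) &
            (pts[:, 1] >= 0) & (pts[:, 1] < height))
    return pts[keep]
```

The returned pixel list can then be drawn as a polyline over the camera frame with any image library.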
CN202010023329.5A 2020-01-09 2020-01-09 Road edge detection method combining laser point cloud steering Pending CN111242000A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010023329.5A CN111242000A (en) 2020-01-09 2020-01-09 Road edge detection method combining laser point cloud steering


Publications (1)

Publication Number Publication Date
CN111242000A true CN111242000A (en) 2020-06-05

Family

ID=70869447

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010023329.5A Pending CN111242000A (en) 2020-01-09 2020-01-09 Road edge detection method combining laser point cloud steering

Country Status (1)

Country Link
CN (1) CN111242000A (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107356933A * 2017-06-23 2017-11-17 Nanjing University of Science and Technology Unstructured road detection method based on a four-line laser radar
CN108470159A * 2018-03-09 2018-08-31 Tencent Technology (Shenzhen) Co., Ltd. Lane line data processing method, device, computer equipment and storage medium
CN109522804A * 2018-10-18 2019-03-26 FAW-Volkswagen Automotive Co., Ltd. Road edge recognition method and system
US20190258251A1 * 2017-11-10 2019-08-22 Nvidia Corporation Systems and methods for safe and reliable autonomous vehicles
CN110175576A * 2019-05-29 2019-08-27 University of Electronic Science and Technology of China Moving-vehicle visual detection method combining laser point cloud data
CN110378196A * 2019-05-29 2019-10-25 University of Electronic Science and Technology of China Road visual detection method combining laser point cloud data
CN110494863A * 2018-03-15 2019-11-22 Nvidia Corporation Determining drivable free space for autonomous vehicles
US20200160530A1 * 2017-07-04 2020-05-21 Robert Bosch Gmbh Image analysis including targeted preprocessing


Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
P.H.S. Torr et al.: "MLESAC: A new robust estimator with application to estimating image geometry", Computer Vision and Image Understanding *
Xiao Hu et al.: "Road Boundary Detection Based On Information Entropy", Chinese Control and Decision Conference *
Shan Xiaojun et al.: "A survey of mismatched-point detection techniques in image matching", Application Research of Computers *
Tang Luliang et al.: "Lane information acquisition based on a constrained Gaussian mixture model", Geomatics and Information Science of Wuhan University *
Fang Jiqian et al.: "Handbook of Medical Statistics", China Statistics Press, 31 May 2018 *
Ma Humin: "Research on road detection based on fusion of heterogeneous lidar and vision data", China Master's Theses Full-text Database, Engineering Science and Technology II *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112258618A (en) * 2020-11-04 2021-01-22 中国科学院空天信息创新研究院 Semantic mapping and positioning method based on fusion of prior laser point cloud and depth map
CN112258618B (en) * 2020-11-04 2021-05-14 中国科学院空天信息创新研究院 Semantic mapping and positioning method based on fusion of prior laser point cloud and depth map
CN112561808A (en) * 2020-11-27 2021-03-26 中央财经大学 Road boundary line restoration method based on vehicle-mounted laser point cloud and satellite image
CN112561808B (en) * 2020-11-27 2023-07-18 中央财经大学 Road boundary line restoration method based on vehicle-mounted laser point cloud and satellite image
CN113838111A (en) * 2021-08-09 2021-12-24 北京中科慧眼科技有限公司 Road texture feature detection method and device and automatic driving system
CN113689400A (en) * 2021-08-24 2021-11-23 凌云光技术股份有限公司 Method and device for detecting section contour edge of depth image
CN113689400B (en) * 2021-08-24 2024-04-19 凌云光技术股份有限公司 Method and device for detecting profile edge of depth image section
CN113570629A (en) * 2021-09-28 2021-10-29 山东大学 Semantic segmentation method and system for removing dynamic objects
CN114279392A (en) * 2021-12-27 2022-04-05 深圳市星卡科技有限公司 Method and device for calibrating steering angle sensor and computer equipment
CN114279392B (en) * 2021-12-27 2024-02-06 深圳市星卡科技股份有限公司 Calibration method and device for steering angle sensor and computer equipment

Similar Documents

Publication Publication Date Title
CN111242000A (en) Road edge detection method combining laser point cloud steering
CN111007531A (en) Road edge detection method based on laser point cloud data
CN111028277B (en) SAR and optical remote sensing image registration method based on pseudo-twin convolution neural network
CN109243289B (en) Method and system for extracting parking spaces of underground garage in high-precision map manufacturing
CN103400151B (en) The optical remote sensing image of integration and GIS autoregistration and Clean water withdraw method
CN110119438B (en) Airborne LiDAR point cloud filtering method based on active learning
CN111145228B (en) Heterologous image registration method based on fusion of local contour points and shape features
CN107610164B (en) High-resolution four-number image registration method based on multi-feature mixing
Yang et al. Fast and accurate vanishing point detection and its application in inverse perspective mapping of structured road
CN112767490B (en) Outdoor three-dimensional synchronous positioning and mapping method based on laser radar
CN104361590A (en) High-resolution remote sensing image registration method with control points distributed in adaptive manner
CN102236794A (en) Recognition and pose determination of 3D objects in 3D scenes
CN104217427A (en) Method for positioning lane lines in traffic surveillance videos
CN110428425B (en) Sea-land separation method of SAR image based on coastline vector data
CN104077760A (en) Rapid splicing system for aerial photogrammetry and implementing method thereof
CN103727930A (en) Edge-matching-based relative pose calibration method of laser range finder and camera
CN109461132B (en) SAR image automatic registration method based on feature point geometric topological relation
CN111028292A (en) Sub-pixel level image matching navigation positioning method
CN104121902A (en) Implementation method of indoor robot visual odometer based on Xtion camera
Qian et al. Robust visual-lidar simultaneous localization and mapping system for UAV
CN111354083A (en) Progressive building extraction method based on original laser point cloud
CN113177593A (en) Fusion method of radar point cloud and image data in water traffic environment
Wang Automatic extraction of building outline from high resolution aerial imagery
Zhang LILO: A Novel Lidar–IMU SLAM System With Loop Optimization
Yao et al. Automatic extraction of road markings from mobile laser-point cloud using intensity data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200605